Dec 16 12:17:41.460196 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Dec 16 12:17:41.460221 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Tue Dec 16 00:05:24 -00 2025 Dec 16 12:17:41.460232 kernel: KASLR enabled Dec 16 12:17:41.460238 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II Dec 16 12:17:41.460244 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390b8118 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218 Dec 16 12:17:41.460250 kernel: random: crng init done Dec 16 12:17:41.460257 kernel: secureboot: Secure boot disabled Dec 16 12:17:41.460263 kernel: ACPI: Early table checksum verification disabled Dec 16 12:17:41.460269 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS ) Dec 16 12:17:41.460276 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013) Dec 16 12:17:41.460282 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:17:41.460288 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:17:41.460294 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:17:41.460300 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:17:41.460309 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:17:41.460315 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:17:41.460322 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:17:41.460328 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:17:41.460335 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:17:41.460341 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013) Dec 16 12:17:41.460347 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Dec 16 12:17:41.460354 kernel: ACPI: Use ACPI SPCR as default console: Yes Dec 16 12:17:41.460360 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff] Dec 16 12:17:41.460368 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff] Dec 16 12:17:41.460374 kernel: Zone ranges: Dec 16 12:17:41.460381 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Dec 16 12:17:41.460387 kernel: DMA32 empty Dec 16 12:17:41.460393 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff] Dec 16 12:17:41.460399 kernel: Device empty Dec 16 12:17:41.460406 kernel: Movable zone start for each node Dec 16 12:17:41.460412 kernel: Early memory node ranges Dec 16 12:17:41.460418 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff] Dec 16 12:17:41.460425 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff] Dec 16 12:17:41.460431 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff] Dec 16 12:17:41.460438 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff] Dec 16 12:17:41.460445 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff] Dec 16 12:17:41.460451 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff] Dec 16 12:17:41.460458 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff] Dec 16 12:17:41.460464 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff] Dec 16 12:17:41.460470 kernel: node 0: [mem 
0x0000000139fe0000-0x0000000139ffffff] Dec 16 12:17:41.460480 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff] Dec 16 12:17:41.460488 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges Dec 16 12:17:41.460495 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1 Dec 16 12:17:41.460501 kernel: psci: probing for conduit method from ACPI. Dec 16 12:17:41.460508 kernel: psci: PSCIv1.1 detected in firmware. Dec 16 12:17:41.460515 kernel: psci: Using standard PSCI v0.2 function IDs Dec 16 12:17:41.460521 kernel: psci: Trusted OS migration not required Dec 16 12:17:41.460528 kernel: psci: SMC Calling Convention v1.1 Dec 16 12:17:41.460535 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Dec 16 12:17:41.460543 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Dec 16 12:17:41.460550 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Dec 16 12:17:41.460572 kernel: pcpu-alloc: [0] 0 [0] 1 Dec 16 12:17:41.460580 kernel: Detected PIPT I-cache on CPU0 Dec 16 12:17:41.460587 kernel: CPU features: detected: GIC system register CPU interface Dec 16 12:17:41.460594 kernel: CPU features: detected: Spectre-v4 Dec 16 12:17:41.460600 kernel: CPU features: detected: Spectre-BHB Dec 16 12:17:41.460607 kernel: CPU features: kernel page table isolation forced ON by KASLR Dec 16 12:17:41.460614 kernel: CPU features: detected: Kernel page table isolation (KPTI) Dec 16 12:17:41.460641 kernel: CPU features: detected: ARM erratum 1418040 Dec 16 12:17:41.460648 kernel: CPU features: detected: SSBS not fully self-synchronizing Dec 16 12:17:41.460657 kernel: alternatives: applying boot alternatives Dec 16 12:17:41.460665 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=756b815c2fd7ac2947efceb2a88878d1ea9723ec85037c2b4d1a09bd798bb749 Dec 16 12:17:41.460672 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Dec 16 12:17:41.460679 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 16 12:17:41.460686 kernel: Fallback order for Node 0: 0 Dec 16 12:17:41.460693 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000 Dec 16 12:17:41.460699 kernel: Policy zone: Normal Dec 16 12:17:41.460706 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Dec 16 12:17:41.460713 kernel: software IO TLB: area num 2. Dec 16 12:17:41.460720 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB) Dec 16 12:17:41.460728 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Dec 16 12:17:41.460735 kernel: rcu: Preemptible hierarchical RCU implementation. Dec 16 12:17:41.460742 kernel: rcu: RCU event tracing is enabled. Dec 16 12:17:41.460749 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Dec 16 12:17:41.460756 kernel: Trampoline variant of Tasks RCU enabled. Dec 16 12:17:41.460763 kernel: Tracing variant of Tasks RCU enabled. Dec 16 12:17:41.460770 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Dec 16 12:17:41.460777 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Dec 16 12:17:41.460784 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Dec 16 12:17:41.460791 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Dec 16 12:17:41.460798 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Dec 16 12:17:41.460807 kernel: GICv3: 256 SPIs implemented Dec 16 12:17:41.460814 kernel: GICv3: 0 Extended SPIs implemented Dec 16 12:17:41.460821 kernel: Root IRQ handler: gic_handle_irq Dec 16 12:17:41.460828 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Dec 16 12:17:41.460835 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Dec 16 12:17:41.460841 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Dec 16 12:17:41.460848 kernel: ITS [mem 0x08080000-0x0809ffff] Dec 16 12:17:41.460855 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1) Dec 16 12:17:41.460862 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1) Dec 16 12:17:41.460869 kernel: GICv3: using LPI property table @0x0000000100120000 Dec 16 12:17:41.460876 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000 Dec 16 12:17:41.460884 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Dec 16 12:17:41.460891 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Dec 16 12:17:41.460898 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Dec 16 12:17:41.460905 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Dec 16 12:17:41.460912 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Dec 16 12:17:41.460919 kernel: Console: colour dummy device 80x25 Dec 16 12:17:41.460926 kernel: ACPI: Core revision 20240827 Dec 16 12:17:41.460933 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Dec 16 12:17:41.460941 kernel: pid_max: default: 32768 minimum: 301 Dec 16 12:17:41.460949 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Dec 16 12:17:41.460956 kernel: landlock: Up and running. Dec 16 12:17:41.460964 kernel: SELinux: Initializing. Dec 16 12:17:41.460971 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Dec 16 12:17:41.460978 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Dec 16 12:17:41.460985 kernel: rcu: Hierarchical SRCU implementation. Dec 16 12:17:41.460992 kernel: rcu: Max phase no-delay instances is 400. Dec 16 12:17:41.461000 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Dec 16 12:17:41.461008 kernel: Remapping and enabling EFI services. Dec 16 12:17:41.461015 kernel: smp: Bringing up secondary CPUs ... Dec 16 12:17:41.461023 kernel: Detected PIPT I-cache on CPU1 Dec 16 12:17:41.461030 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Dec 16 12:17:41.461037 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000 Dec 16 12:17:41.461044 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Dec 16 12:17:41.461051 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Dec 16 12:17:41.461059 kernel: smp: Brought up 1 node, 2 CPUs Dec 16 12:17:41.461067 kernel: SMP: Total of 2 processors activated. 
Dec 16 12:17:41.461079 kernel: CPU: All CPU(s) started at EL1 Dec 16 12:17:41.461088 kernel: CPU features: detected: 32-bit EL0 Support Dec 16 12:17:41.461095 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Dec 16 12:17:41.461103 kernel: CPU features: detected: Common not Private translations Dec 16 12:17:41.461110 kernel: CPU features: detected: CRC32 instructions Dec 16 12:17:41.461118 kernel: CPU features: detected: Enhanced Virtualization Traps Dec 16 12:17:41.461134 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Dec 16 12:17:41.461143 kernel: CPU features: detected: LSE atomic instructions Dec 16 12:17:41.461151 kernel: CPU features: detected: Privileged Access Never Dec 16 12:17:41.461158 kernel: CPU features: detected: RAS Extension Support Dec 16 12:17:41.461166 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Dec 16 12:17:41.461173 kernel: alternatives: applying system-wide alternatives Dec 16 12:17:41.461184 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1 Dec 16 12:17:41.461192 kernel: Memory: 3885924K/4096000K available (11200K kernel code, 2456K rwdata, 9084K rodata, 12480K init, 1038K bss, 188596K reserved, 16384K cma-reserved) Dec 16 12:17:41.461200 kernel: devtmpfs: initialized Dec 16 12:17:41.461208 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Dec 16 12:17:41.461215 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Dec 16 12:17:41.461223 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Dec 16 12:17:41.461231 kernel: 0 pages in range for non-PLT usage Dec 16 12:17:41.461240 kernel: 515168 pages in range for PLT usage Dec 16 12:17:41.461247 kernel: pinctrl core: initialized pinctrl subsystem Dec 16 12:17:41.461254 kernel: SMBIOS 3.0.0 present. Dec 16 12:17:41.461262 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017 Dec 16 12:17:41.461270 kernel: DMI: Memory slots populated: 1/1 Dec 16 12:17:41.461277 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Dec 16 12:17:41.461285 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Dec 16 12:17:41.461294 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Dec 16 12:17:41.461301 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Dec 16 12:17:41.461309 kernel: audit: initializing netlink subsys (disabled) Dec 16 12:17:41.461317 kernel: audit: type=2000 audit(0.013:1): state=initialized audit_enabled=0 res=1 Dec 16 12:17:41.461324 kernel: thermal_sys: Registered thermal governor 'step_wise' Dec 16 12:17:41.461332 kernel: cpuidle: using governor menu Dec 16 12:17:41.461339 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Dec 16 12:17:41.461348 kernel: ASID allocator initialised with 32768 entries Dec 16 12:17:41.461356 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Dec 16 12:17:41.461363 kernel: Serial: AMBA PL011 UART driver Dec 16 12:17:41.461371 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Dec 16 12:17:41.461378 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Dec 16 12:17:41.461386 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Dec 16 12:17:41.461393 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Dec 16 12:17:41.461401 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Dec 16 12:17:41.461410 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Dec 16 12:17:41.461417 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Dec 16 12:17:41.461425 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Dec 16 12:17:41.461432 kernel: ACPI: Added _OSI(Module Device) Dec 16 12:17:41.461440 kernel: ACPI: Added _OSI(Processor Device) Dec 16 12:17:41.461447 kernel: ACPI: Added _OSI(Processor Aggregator Device) Dec 16 12:17:41.461455 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Dec 16 12:17:41.461463 kernel: ACPI: Interpreter enabled Dec 16 12:17:41.461471 kernel: ACPI: Using GIC for interrupt routing Dec 16 12:17:41.461478 kernel: ACPI: MCFG table detected, 1 entries Dec 16 12:17:41.461486 kernel: ACPI: CPU0 has been hot-added Dec 16 12:17:41.461494 kernel: ACPI: CPU1 has been hot-added Dec 16 12:17:41.461501 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Dec 16 12:17:41.461509 kernel: printk: legacy console [ttyAMA0] enabled Dec 16 12:17:41.461518 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Dec 16 12:17:41.463378 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Dec 16 12:17:41.463496 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Dec 16 12:17:41.463578 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Dec 16 12:17:41.464481 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Dec 16 12:17:41.464578 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Dec 16 12:17:41.464596 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Dec 16 12:17:41.464604 kernel: PCI host bridge to bus 0000:00 Dec 16 12:17:41.464731 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Dec 16 12:17:41.464810 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Dec 16 12:17:41.464882 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Dec 16 12:17:41.464954 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Dec 16 12:17:41.465061 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Dec 16 12:17:41.465172 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint Dec 16 12:17:41.465263 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff] Dec 16 12:17:41.465346 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref] Dec 16 12:17:41.465436 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:17:41.465522 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff] Dec 16 12:17:41.465602 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Dec 16 
12:17:41.466804 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Dec 16 12:17:41.466901 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Dec 16 12:17:41.466996 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:17:41.467081 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff] Dec 16 12:17:41.467193 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Dec 16 12:17:41.467279 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff] Dec 16 12:17:41.467369 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:17:41.467450 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff] Dec 16 12:17:41.467530 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Dec 16 12:17:41.467613 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff] Dec 16 12:17:41.468803 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Dec 16 12:17:41.468903 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:17:41.468986 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff] Dec 16 12:17:41.469066 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Dec 16 12:17:41.469159 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff] Dec 16 12:17:41.469247 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Dec 16 12:17:41.469341 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:17:41.469422 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff] Dec 16 12:17:41.469502 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Dec 16 12:17:41.469581 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Dec 16 12:17:41.470764 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Dec 16 12:17:41.470880 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:17:41.470963 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff] Dec 16 12:17:41.471044 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Dec 16 12:17:41.471124 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff] Dec 16 12:17:41.471227 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Dec 16 12:17:41.471316 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:17:41.471401 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff] Dec 16 12:17:41.471481 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Dec 16 12:17:41.471561 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff] Dec 16 12:17:41.473503 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref] Dec 16 12:17:41.473646 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:17:41.473738 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff] Dec 16 12:17:41.473828 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Dec 16 12:17:41.473909 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff] Dec 16 12:17:41.474003 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:17:41.474086 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff] Dec 16 12:17:41.474188 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Dec 16 12:17:41.474275 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff] Dec 16 12:17:41.474371 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 
0x070002 conventional PCI endpoint Dec 16 12:17:41.474468 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007] Dec 16 12:17:41.476208 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Dec 16 12:17:41.476316 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff] Dec 16 12:17:41.476407 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Dec 16 12:17:41.476489 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Dec 16 12:17:41.476584 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Dec 16 12:17:41.476690 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit] Dec 16 12:17:41.476788 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Dec 16 12:17:41.476871 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff] Dec 16 12:17:41.476957 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] Dec 16 12:17:41.477053 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Dec 16 12:17:41.477176 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] Dec 16 12:17:41.477279 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Dec 16 12:17:41.477365 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff] Dec 16 12:17:41.477450 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] Dec 16 12:17:41.477543 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Dec 16 12:17:41.477646 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff] Dec 16 12:17:41.477735 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] Dec 16 12:17:41.477826 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Dec 16 12:17:41.477910 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff] Dec 16 12:17:41.477995 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref] Dec 16 12:17:41.478077 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref] Dec 16 12:17:41.478175 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Dec 16 12:17:41.478260 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Dec 16 12:17:41.478339 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Dec 16 12:17:41.478425 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Dec 16 12:17:41.478509 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Dec 16 12:17:41.478589 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Dec 16 12:17:41.478757 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Dec 16 12:17:41.478886 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Dec 16 12:17:41.478968 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Dec 16 12:17:41.479054 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Dec 16 12:17:41.479161 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Dec 16 12:17:41.479249 kernel: pci 0000:00:02.3: bridge window [mem 
0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Dec 16 12:17:41.479333 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Dec 16 12:17:41.479413 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Dec 16 12:17:41.479492 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Dec 16 12:17:41.479579 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Dec 16 12:17:41.479767 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Dec 16 12:17:41.479859 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Dec 16 12:17:41.479944 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Dec 16 12:17:41.480027 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000 Dec 16 12:17:41.480108 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000 Dec 16 12:17:41.480255 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Dec 16 12:17:41.480343 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Dec 16 12:17:41.480423 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Dec 16 12:17:41.480507 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Dec 16 12:17:41.480587 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Dec 16 12:17:41.480712 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Dec 16 12:17:41.480802 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned Dec 16 12:17:41.480888 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Dec 16 12:17:41.481054 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned Dec 16 12:17:41.481174 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Dec 16 12:17:41.481266 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned Dec 16 12:17:41.481350 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Dec 16 12:17:41.481443 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned Dec 16 12:17:41.481525 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Dec 16 12:17:41.481610 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned Dec 16 12:17:41.481732 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Dec 16 12:17:41.481814 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Dec 16 12:17:41.481895 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Dec 16 12:17:41.482039 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Dec 16 12:17:41.482253 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Dec 16 
12:17:41.482364 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Dec 16 12:17:41.482448 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Dec 16 12:17:41.482529 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned Dec 16 12:17:41.482902 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Dec 16 12:17:41.483013 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned Dec 16 12:17:41.483097 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned Dec 16 12:17:41.483271 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned Dec 16 12:17:41.483361 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned Dec 16 12:17:41.483446 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned Dec 16 12:17:41.483536 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned Dec 16 12:17:41.483637 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned Dec 16 12:17:41.483724 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned Dec 16 12:17:41.484288 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned Dec 16 12:17:41.484382 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned Dec 16 12:17:41.484473 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned Dec 16 12:17:41.484553 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned Dec 16 12:17:41.484716 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned Dec 16 12:17:41.484808 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned Dec 16 12:17:41.484891 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned Dec 16 12:17:41.484970 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned Dec 16 12:17:41.485051 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned Dec 16 12:17:41.485202 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned Dec 16 12:17:41.485292 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned Dec 16 12:17:41.485373 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned Dec 16 12:17:41.485494 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned Dec 16 12:17:41.485593 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Dec 16 12:17:41.485695 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Dec 16 12:17:41.485785 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Dec 16 12:17:41.485866 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Dec 16 12:17:41.485989 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Dec 16 12:17:41.486078 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff] Dec 16 12:17:41.486173 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 12:17:41.486265 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Dec 16 12:17:41.486350 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Dec 16 12:17:41.486431 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Dec 16 12:17:41.486513 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff] Dec 16 12:17:41.486594 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 12:17:41.486737 kernel: pci 0000:03:00.0: BAR 4 [mem 
0x8000400000-0x8000403fff 64bit pref]: assigned Dec 16 12:17:41.486824 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Dec 16 12:17:41.486910 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Dec 16 12:17:41.486999 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Dec 16 12:17:41.487081 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff] Dec 16 12:17:41.487200 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 12:17:41.487296 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Dec 16 12:17:41.487383 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Dec 16 12:17:41.487464 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Dec 16 12:17:41.487544 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff] Dec 16 12:17:41.487634 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 12:17:41.488800 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Dec 16 12:17:41.488896 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Dec 16 12:17:41.488986 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Dec 16 12:17:41.489067 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Dec 16 12:17:41.489224 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Dec 16 12:17:41.489311 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 12:17:41.489401 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Dec 16 12:17:41.489484 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Dec 16 12:17:41.489572 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Dec 16 12:17:41.490757 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Dec 16 12:17:41.490857 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff] Dec 16 12:17:41.490941 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:17:41.491034 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned Dec 16 12:17:41.491122 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned Dec 16 12:17:41.491273 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned Dec 16 12:17:41.491361 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Dec 16 12:17:41.491450 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Dec 16 12:17:41.491534 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff] Dec 16 12:17:41.491614 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:17:41.491732 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Dec 16 12:17:41.491815 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Dec 16 12:17:41.491896 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff] Dec 16 12:17:41.491977 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:17:41.492062 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Dec 16 12:17:41.492163 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff] Dec 16 12:17:41.492249 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Dec 16 12:17:41.492331 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:17:41.492418 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Dec 16 12:17:41.492492 kernel: pci_bus 0000:00: 
resource 5 [io 0x0000-0xffff window] Dec 16 12:17:41.492568 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Dec 16 12:17:41.493764 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Dec 16 12:17:41.493859 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Dec 16 12:17:41.493936 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 12:17:41.494021 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Dec 16 12:17:41.494104 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Dec 16 12:17:41.494231 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 12:17:41.494322 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Dec 16 12:17:41.494398 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Dec 16 12:17:41.494473 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 12:17:41.494557 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Dec 16 12:17:41.494661 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Dec 16 12:17:41.494741 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 12:17:41.494832 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Dec 16 12:17:41.494909 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Dec 16 12:17:41.494984 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 12:17:41.495070 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Dec 16 12:17:41.495169 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Dec 16 12:17:41.495255 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:17:41.495340 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Dec 16 12:17:41.495417 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Dec 16 12:17:41.495492 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:17:41.495577 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Dec 16 12:17:41.496738 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Dec 16 12:17:41.496836 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:17:41.496924 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Dec 16 12:17:41.497001 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Dec 16 12:17:41.497084 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:17:41.497095 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Dec 16 12:17:41.497103 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Dec 16 12:17:41.497112 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Dec 16 12:17:41.497120 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Dec 16 12:17:41.497168 kernel: iommu: Default domain type: Translated Dec 16 12:17:41.497178 kernel: iommu: DMA domain TLB invalidation policy: strict mode Dec 16 12:17:41.497191 kernel: efivars: Registered efivars operations Dec 16 12:17:41.497199 kernel: vgaarb: loaded Dec 16 12:17:41.497207 kernel: clocksource: Switched to clocksource arch_sys_counter Dec 16 12:17:41.497215 kernel: VFS: Disk quotas dquot_6.6.0 Dec 16 12:17:41.497223 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 16 12:17:41.497231 kernel: pnp: PnP ACPI init Dec 16 12:17:41.497351 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Dec 16 
12:17:41.497367 kernel: pnp: PnP ACPI: found 1 devices Dec 16 12:17:41.497375 kernel: NET: Registered PF_INET protocol family Dec 16 12:17:41.497383 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Dec 16 12:17:41.497391 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Dec 16 12:17:41.497400 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 16 12:17:41.497408 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 16 12:17:41.497416 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Dec 16 12:17:41.497426 kernel: TCP: Hash tables configured (established 32768 bind 32768) Dec 16 12:17:41.497434 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Dec 16 12:17:41.497442 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Dec 16 12:17:41.497450 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 16 12:17:41.497547 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Dec 16 12:17:41.497559 kernel: PCI: CLS 0 bytes, default 64 Dec 16 12:17:41.497567 kernel: kvm [1]: HYP mode not available Dec 16 12:17:41.497578 kernel: Initialise system trusted keyrings Dec 16 12:17:41.497586 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Dec 16 12:17:41.497595 kernel: Key type asymmetric registered Dec 16 12:17:41.497603 kernel: Asymmetric key parser 'x509' registered Dec 16 12:17:41.497611 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Dec 16 12:17:41.498819 kernel: io scheduler mq-deadline registered Dec 16 12:17:41.498834 kernel: io scheduler kyber registered Dec 16 12:17:41.498847 kernel: io scheduler bfq registered Dec 16 12:17:41.498857 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Dec 16 12:17:41.498979 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Dec 16 12:17:41.499065 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Dec 16 12:17:41.499165 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:17:41.499257 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Dec 16 12:17:41.499380 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Dec 16 12:17:41.499470 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:17:41.499558 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Dec 16 12:17:41.499652 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Dec 16 12:17:41.499757 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:17:41.499849 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Dec 16 12:17:41.499945 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Dec 16 12:17:41.500031 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:17:41.500117 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Dec 16 12:17:41.500249 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Dec 16 12:17:41.500334 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:17:41.500418 kernel: pcieport 
0000:00:02.5: PME: Signaling with IRQ 55 Dec 16 12:17:41.500499 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Dec 16 12:17:41.500580 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:17:41.501310 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Dec 16 12:17:41.501410 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Dec 16 12:17:41.501495 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:17:41.501582 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Dec 16 12:17:41.501903 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Dec 16 12:17:41.501995 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:17:41.502010 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 16 12:17:41.502094 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Dec 16 12:17:41.502233 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Dec 16 12:17:41.502319 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:17:41.502330 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 16 12:17:41.502339 kernel: ACPI: button: Power Button [PWRB] Dec 16 12:17:41.502351 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 16 12:17:41.502439 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 16 12:17:41.502528 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Dec 16 12:17:41.502539 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 12:17:41.502548 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 16 12:17:41.502668 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Dec 16 12:17:41.502681 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Dec 16 12:17:41.502693 kernel: thunder_xcv, ver 1.0 Dec 16 12:17:41.502701 kernel: thunder_bgx, ver 1.0 Dec 16 12:17:41.502709 kernel: nicpf, ver 1.0 Dec 16 12:17:41.502717 kernel: nicvf, ver 1.0 Dec 16 12:17:41.502820 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 16 12:17:41.502899 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-16T12:17:40 UTC (1765887460) Dec 16 12:17:41.502910 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 16 12:17:41.502920 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Dec 16 12:17:41.502929 kernel: watchdog: NMI not fully supported Dec 16 12:17:41.502937 kernel: watchdog: Hard watchdog permanently disabled Dec 16 12:17:41.502945 kernel: NET: Registered PF_INET6 protocol family Dec 16 12:17:41.502953 kernel: Segment Routing with IPv6 Dec 16 12:17:41.502961 kernel: In-situ OAM (IOAM) with IPv6 Dec 16 12:17:41.502969 kernel: NET: Registered PF_PACKET protocol family Dec 16 12:17:41.502979 kernel: Key type dns_resolver registered Dec 16 12:17:41.502987 kernel: registered taskstats version 1 Dec 16 12:17:41.502995 kernel: Loading compiled-in X.509 certificates Dec 16 12:17:41.503003 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 545838337a91b65b763486e536766b3eec3ef99d' Dec 16 12:17:41.503011 kernel: Demotion targets for Node 0: null Dec 16 12:17:41.503019 kernel: Key type .fscrypt 
registered Dec 16 12:17:41.503026 kernel: Key type fscrypt-provisioning registered Dec 16 12:17:41.503035 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 16 12:17:41.503044 kernel: ima: Allocated hash algorithm: sha1 Dec 16 12:17:41.503052 kernel: ima: No architecture policies found Dec 16 12:17:41.503061 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Dec 16 12:17:41.503070 kernel: clk: Disabling unused clocks Dec 16 12:17:41.503078 kernel: PM: genpd: Disabling unused power domains Dec 16 12:17:41.503086 kernel: Freeing unused kernel memory: 12480K Dec 16 12:17:41.503095 kernel: Run /init as init process Dec 16 12:17:41.503103 kernel: with arguments: Dec 16 12:17:41.503111 kernel: /init Dec 16 12:17:41.503119 kernel: with environment: Dec 16 12:17:41.503137 kernel: HOME=/ Dec 16 12:17:41.503147 kernel: TERM=linux Dec 16 12:17:41.503155 kernel: ACPI: bus type USB registered Dec 16 12:17:41.503162 kernel: usbcore: registered new interface driver usbfs Dec 16 12:17:41.503173 kernel: usbcore: registered new interface driver hub Dec 16 12:17:41.503181 kernel: usbcore: registered new device driver usb Dec 16 12:17:41.503279 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 12:17:41.503364 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Dec 16 12:17:41.503447 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Dec 16 12:17:41.503528 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 12:17:41.503653 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Dec 16 12:17:41.503742 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Dec 16 12:17:41.503853 kernel: hub 1-0:1.0: USB hub found Dec 16 12:17:41.503944 kernel: hub 1-0:1.0: 4 ports detected Dec 16 12:17:41.504048 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Dec 16 12:17:41.504159 kernel: hub 2-0:1.0: USB hub found Dec 16 12:17:41.504257 kernel: hub 2-0:1.0: 4 ports detected Dec 16 12:17:41.504267 kernel: SCSI subsystem initialized Dec 16 12:17:41.504365 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Dec 16 12:17:41.504465 kernel: scsi host0: Virtio SCSI HBA Dec 16 12:17:41.504578 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Dec 16 12:17:41.504821 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Dec 16 12:17:41.504925 kernel: sd 0:0:0:1: Power-on or device reset occurred Dec 16 12:17:41.505018 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Dec 16 12:17:41.505123 kernel: sd 0:0:0:1: [sda] Write Protect is off Dec 16 12:17:41.505265 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Dec 16 12:17:41.505362 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Dec 16 12:17:41.505373 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 16 12:17:41.505381 kernel: GPT:25804799 != 80003071 Dec 16 12:17:41.505389 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 16 12:17:41.505397 kernel: GPT:25804799 != 80003071 Dec 16 12:17:41.505405 kernel: GPT: Use GNU Parted to correct GPT errors. 
Dec 16 12:17:41.505413 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 16 12:17:41.505501 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Dec 16 12:17:41.505592 kernel: sr 0:0:0:0: Power-on or device reset occurred Dec 16 12:17:41.505716 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Dec 16 12:17:41.505729 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Dec 16 12:17:41.505819 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Dec 16 12:17:41.505830 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 16 12:17:41.505841 kernel: device-mapper: uevent: version 1.0.3 Dec 16 12:17:41.505849 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 16 12:17:41.505857 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Dec 16 12:17:41.505865 kernel: raid6: neonx8 gen() 15643 MB/s Dec 16 12:17:41.505873 kernel: raid6: neonx4 gen() 8670 MB/s Dec 16 12:17:41.505881 kernel: raid6: neonx2 gen() 7344 MB/s Dec 16 12:17:41.505889 kernel: raid6: neonx1 gen() 5840 MB/s Dec 16 12:17:41.505898 kernel: raid6: int64x8 gen() 3773 MB/s Dec 16 12:17:41.505906 kernel: raid6: int64x4 gen() 4300 MB/s Dec 16 12:17:41.505914 kernel: raid6: int64x2 gen() 6064 MB/s Dec 16 12:17:41.506021 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Dec 16 12:17:41.506033 kernel: raid6: int64x1 gen() 4962 MB/s Dec 16 12:17:41.506041 kernel: raid6: using algorithm neonx8 gen() 15643 MB/s Dec 16 12:17:41.506049 kernel: raid6: .... xor() 11908 MB/s, rmw enabled Dec 16 12:17:41.506059 kernel: raid6: using neon recovery algorithm Dec 16 12:17:41.506067 kernel: xor: measuring software checksum speed Dec 16 12:17:41.506075 kernel: 8regs : 21556 MB/sec Dec 16 12:17:41.506083 kernel: 32regs : 21693 MB/sec Dec 16 12:17:41.506091 kernel: arm64_neon : 25449 MB/sec Dec 16 12:17:41.506099 kernel: xor: using function: arm64_neon (25449 MB/sec) Dec 16 12:17:41.506107 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 16 12:17:41.506117 kernel: BTRFS: device fsid d00a2bc5-1c68-4957-aa37-d070193fcf05 devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (212) Dec 16 12:17:41.506125 kernel: BTRFS info (device dm-0): first mount of filesystem d00a2bc5-1c68-4957-aa37-d070193fcf05 Dec 16 12:17:41.506145 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:17:41.506153 kernel: BTRFS info (device dm-0): enabling ssd optimizations Dec 16 12:17:41.506162 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 16 12:17:41.506170 kernel: BTRFS info (device dm-0): enabling free space tree Dec 16 12:17:41.506178 kernel: loop: module loaded Dec 16 12:17:41.506188 kernel: loop0: detected capacity change from 0 to 91832 Dec 16 12:17:41.506197 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 16 12:17:41.506309 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Dec 16 12:17:41.506322 systemd[1]: Successfully made /usr/ read-only. Dec 16 12:17:41.506334 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 12:17:41.506345 systemd[1]: Detected virtualization kvm. 
Dec 16 12:17:41.506353 systemd[1]: Detected architecture arm64. Dec 16 12:17:41.506361 systemd[1]: Running in initrd. Dec 16 12:17:41.506370 systemd[1]: No hostname configured, using default hostname. Dec 16 12:17:41.506379 systemd[1]: Hostname set to . Dec 16 12:17:41.506387 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Dec 16 12:17:41.506395 systemd[1]: Queued start job for default target initrd.target. Dec 16 12:17:41.506406 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Dec 16 12:17:41.506415 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:17:41.506423 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:17:41.506433 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 16 12:17:41.506441 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 12:17:41.506451 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 16 12:17:41.506461 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 16 12:17:41.506470 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:17:41.506478 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:17:41.506487 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 16 12:17:41.506495 systemd[1]: Reached target paths.target - Path Units. Dec 16 12:17:41.506504 systemd[1]: Reached target slices.target - Slice Units. Dec 16 12:17:41.506514 systemd[1]: Reached target swap.target - Swaps. Dec 16 12:17:41.506522 systemd[1]: Reached target timers.target - Timer Units. Dec 16 12:17:41.506530 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 12:17:41.506539 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 12:17:41.506548 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:17:41.506556 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 16 12:17:41.506565 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 16 12:17:41.506575 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:17:41.506583 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 12:17:41.506592 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:17:41.506601 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 12:17:41.506609 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 16 12:17:41.506631 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 16 12:17:41.506641 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 12:17:41.506652 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 16 12:17:41.506673 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 16 12:17:41.506681 systemd[1]: Starting systemd-fsck-usr.service... 
Dec 16 12:17:41.506690 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 12:17:41.506698 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 12:17:41.506709 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:17:41.506718 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Dec 16 12:17:41.506727 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:17:41.506736 systemd[1]: Finished systemd-fsck-usr.service. Dec 16 12:17:41.506745 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 12:17:41.506755 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 16 12:17:41.506787 systemd-journald[348]: Collecting audit messages is enabled. Dec 16 12:17:41.506808 kernel: Bridge firewalling registered Dec 16 12:17:41.506819 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 12:17:41.506828 kernel: audit: type=1130 audit(1765887461.463:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.506836 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 12:17:41.506845 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 12:17:41.506854 kernel: audit: type=1130 audit(1765887461.476:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.506863 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:17:41.506873 kernel: audit: type=1130 audit(1765887461.483:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.506882 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 16 12:17:41.506891 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 12:17:41.506900 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:17:41.506908 kernel: audit: type=1130 audit(1765887461.501:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.506916 kernel: audit: type=1334 audit(1765887461.504:6): prog-id=6 op=LOAD Dec 16 12:17:41.506926 systemd-journald[348]: Journal started Dec 16 12:17:41.506946 systemd-journald[348]: Runtime Journal (/run/log/journal/f9f1d3aecfed44b49c7ab89853b071fa) is 8M, max 76.5M, 68.5M free. Dec 16 12:17:41.463000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.476000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:41.483000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.501000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.504000 audit: BPF prog-id=6 op=LOAD Dec 16 12:17:41.459294 systemd-modules-load[350]: Inserted module 'br_netfilter' Dec 16 12:17:41.509645 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 12:17:41.514640 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 12:17:41.513000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.517725 kernel: audit: type=1130 audit(1765887461.513:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.521521 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 12:17:41.526613 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:17:41.527000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.530923 kernel: audit: type=1130 audit(1765887461.527:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.534939 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 12:17:41.535000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.537800 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 16 12:17:41.539012 kernel: audit: type=1130 audit(1765887461.535:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.542120 systemd-tmpfiles[377]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 16 12:17:41.552236 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:17:41.552000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.557647 kernel: audit: type=1130 audit(1765887461.552:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.580585 systemd-resolved[366]: Positive Trust Anchors: Dec 16 12:17:41.581283 systemd-resolved[366]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 12:17:41.584795 dracut-cmdline[388]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=756b815c2fd7ac2947efceb2a88878d1ea9723ec85037c2b4d1a09bd798bb749 Dec 16 12:17:41.581287 systemd-resolved[366]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Dec 16 12:17:41.581321 systemd-resolved[366]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 12:17:41.618520 systemd-resolved[366]: Defaulting to hostname 'linux'. Dec 16 12:17:41.620343 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 12:17:41.620000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.621539 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:17:41.682642 kernel: Loading iSCSI transport class v2.0-870. Dec 16 12:17:41.693661 kernel: iscsi: registered transport (tcp) Dec 16 12:17:41.708701 kernel: iscsi: registered transport (qla4xxx) Dec 16 12:17:41.708764 kernel: QLogic iSCSI HBA Driver Dec 16 12:17:41.738685 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 12:17:41.771097 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:17:41.771000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.774047 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 12:17:41.827783 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 16 12:17:41.827000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.829398 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 16 12:17:41.830712 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 16 12:17:41.872897 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 16 12:17:41.872000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:41.873000 audit: BPF prog-id=7 op=LOAD Dec 16 12:17:41.873000 audit: BPF prog-id=8 op=LOAD Dec 16 12:17:41.875516 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:17:41.907834 systemd-udevd[630]: Using default interface naming scheme 'v257'. Dec 16 12:17:41.917972 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:17:41.918000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.919681 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 16 12:17:41.951739 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 12:17:41.951000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:41.952000 audit: BPF prog-id=9 op=LOAD Dec 16 12:17:41.953829 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 12:17:41.955354 dracut-pre-trigger[696]: rd.md=0: removing MD RAID activation Dec 16 12:17:41.996949 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 12:17:41.997000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:42.001762 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 12:17:42.010824 systemd-networkd[740]: lo: Link UP Dec 16 12:17:42.010828 systemd-networkd[740]: lo: Gained carrier Dec 16 12:17:42.012556 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 12:17:42.013000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:42.013853 systemd[1]: Reached target network.target - Network. Dec 16 12:17:42.075306 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:17:42.076000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:42.081085 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 16 12:17:42.206325 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Dec 16 12:17:42.222080 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Dec 16 12:17:42.242219 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Dec 16 12:17:42.244470 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Dec 16 12:17:42.256648 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Dec 16 12:17:42.259638 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Dec 16 12:17:42.267202 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Dec 16 12:17:42.266206 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 16 12:17:42.283010 disk-uuid[805]: Primary Header is updated. Dec 16 12:17:42.283010 disk-uuid[805]: Secondary Entries is updated. Dec 16 12:17:42.283010 disk-uuid[805]: Secondary Header is updated. Dec 16 12:17:42.314043 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Dec 16 12:17:42.314316 kernel: usbcore: registered new interface driver usbhid Dec 16 12:17:42.314329 kernel: usbhid: USB HID core driver Dec 16 12:17:42.315805 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:17:42.315928 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:17:42.318577 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:17:42.317000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:42.323481 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:17:42.325465 systemd-networkd[740]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:17:42.325469 systemd-networkd[740]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:17:42.327777 systemd-networkd[740]: eth1: Link UP Dec 16 12:17:42.328256 systemd-networkd[740]: eth1: Gained carrier Dec 16 12:17:42.328269 systemd-networkd[740]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:17:42.340594 systemd-networkd[740]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:17:42.340603 systemd-networkd[740]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:17:42.340930 systemd-networkd[740]: eth0: Link UP Dec 16 12:17:42.341828 systemd-networkd[740]: eth0: Gained carrier Dec 16 12:17:42.341840 systemd-networkd[740]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:17:42.366000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:42.366783 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:17:42.385753 systemd-networkd[740]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Dec 16 12:17:42.396938 systemd-networkd[740]: eth0: DHCPv4 address 46.224.130.63/32, gateway 172.31.1.1 acquired from 172.31.1.1 Dec 16 12:17:42.402556 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
Dec 16 12:17:42.403000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:42.404059 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 12:17:42.405577 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:17:42.407428 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 12:17:42.410274 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 16 12:17:42.441405 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 16 12:17:42.443000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:43.340709 disk-uuid[807]: Warning: The kernel is still using the old partition table. Dec 16 12:17:43.340709 disk-uuid[807]: The new table will be used at the next reboot or after you Dec 16 12:17:43.340709 disk-uuid[807]: run partprobe(8) or kpartx(8) Dec 16 12:17:43.340709 disk-uuid[807]: The operation has completed successfully. Dec 16 12:17:43.351382 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 16 12:17:43.351000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:43.351000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:43.351495 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 16 12:17:43.354778 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 16 12:17:43.388654 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (839) Dec 16 12:17:43.390644 kernel: BTRFS info (device sda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:17:43.390701 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:17:43.394246 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 12:17:43.394303 kernel: BTRFS info (device sda6): turning on async discard Dec 16 12:17:43.394315 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 12:17:43.400658 kernel: BTRFS info (device sda6): last unmount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:17:43.401700 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 16 12:17:43.401000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:43.403965 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Dec 16 12:17:43.528319 ignition[858]: Ignition 2.24.0 Dec 16 12:17:43.529520 ignition[858]: Stage: fetch-offline Dec 16 12:17:43.529582 ignition[858]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:17:43.529596 ignition[858]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:17:43.529774 ignition[858]: parsed url from cmdline: "" Dec 16 12:17:43.529778 ignition[858]: no config URL provided Dec 16 12:17:43.530298 ignition[858]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 12:17:43.530313 ignition[858]: no config at "/usr/lib/ignition/user.ign" Dec 16 12:17:43.530319 ignition[858]: failed to fetch config: resource requires networking Dec 16 12:17:43.535000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:43.534497 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 12:17:43.532851 ignition[858]: Ignition finished successfully Dec 16 12:17:43.536376 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Dec 16 12:17:43.565428 ignition[867]: Ignition 2.24.0 Dec 16 12:17:43.565446 ignition[867]: Stage: fetch Dec 16 12:17:43.565601 ignition[867]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:17:43.565609 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:17:43.565716 ignition[867]: parsed url from cmdline: "" Dec 16 12:17:43.565720 ignition[867]: no config URL provided Dec 16 12:17:43.565727 ignition[867]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 12:17:43.565733 ignition[867]: no config at "/usr/lib/ignition/user.ign" Dec 16 12:17:43.565761 ignition[867]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Dec 16 12:17:43.573068 ignition[867]: GET result: OK Dec 16 12:17:43.574004 ignition[867]: parsing config with SHA512: 7fa0ab7a3c14a95845f66d281b29f4ca4d6d0f872a623881d5bb5a9c77c90c4bd11b9465d95b477ecb55e5a74583425a5d20c01e797ea03c77f8076aa0a47fd1 Dec 16 12:17:43.581001 unknown[867]: fetched base config from "system" Dec 16 12:17:43.581441 ignition[867]: fetch: fetch complete Dec 16 12:17:43.581012 unknown[867]: fetched base config from "system" Dec 16 12:17:43.581447 ignition[867]: fetch: fetch passed Dec 16 12:17:43.581017 unknown[867]: fetched user config from "hetzner" Dec 16 12:17:43.581505 ignition[867]: Ignition finished successfully Dec 16 12:17:43.584000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:43.584248 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 16 12:17:43.587676 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 16 12:17:43.612548 ignition[873]: Ignition 2.24.0 Dec 16 12:17:43.612569 ignition[873]: Stage: kargs Dec 16 12:17:43.612773 ignition[873]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:17:43.612785 ignition[873]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:17:43.613670 ignition[873]: kargs: kargs passed Dec 16 12:17:43.613724 ignition[873]: Ignition finished successfully Dec 16 12:17:43.617497 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Dec 16 12:17:43.618000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:43.621220 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 16 12:17:43.650795 ignition[880]: Ignition 2.24.0 Dec 16 12:17:43.650882 ignition[880]: Stage: disks Dec 16 12:17:43.651037 ignition[880]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:17:43.651046 ignition[880]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:17:43.651882 ignition[880]: disks: disks passed Dec 16 12:17:43.654472 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 16 12:17:43.651936 ignition[880]: Ignition finished successfully Dec 16 12:17:43.655000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:43.656263 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 16 12:17:43.658246 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 16 12:17:43.659000 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 12:17:43.660738 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 12:17:43.662221 systemd[1]: Reached target basic.target - Basic System. Dec 16 12:17:43.664754 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 16 12:17:43.706760 systemd-fsck[888]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Dec 16 12:17:43.711881 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 16 12:17:43.712000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:43.715768 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 16 12:17:43.794685 kernel: EXT4-fs (sda9): mounted filesystem 0e69f709-36a9-4e15-b0c9-c7e150185653 r/w with ordered data mode. Quota mode: none. Dec 16 12:17:43.794777 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 16 12:17:43.795892 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 16 12:17:43.798490 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 12:17:43.801184 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 16 12:17:43.819546 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Dec 16 12:17:43.822849 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 16 12:17:43.822916 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:17:43.829099 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 16 12:17:43.834893 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Dec 16 12:17:43.841641 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (896) Dec 16 12:17:43.844151 kernel: BTRFS info (device sda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:17:43.844214 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:17:43.853394 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 12:17:43.853480 kernel: BTRFS info (device sda6): turning on async discard Dec 16 12:17:43.853592 systemd-networkd[740]: eth0: Gained IPv6LL Dec 16 12:17:43.853911 systemd-networkd[740]: eth1: Gained IPv6LL Dec 16 12:17:43.856992 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 12:17:43.861468 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 12:17:43.898280 coreos-metadata[898]: Dec 16 12:17:43.897 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Dec 16 12:17:43.901061 coreos-metadata[898]: Dec 16 12:17:43.901 INFO Fetch successful Dec 16 12:17:43.901061 coreos-metadata[898]: Dec 16 12:17:43.901 INFO wrote hostname ci-4547-0-0-5-8fe0b910ae to /sysroot/etc/hostname Dec 16 12:17:43.903985 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 16 12:17:43.905000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:44.019341 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 16 12:17:44.020000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:44.022826 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 16 12:17:44.025567 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 16 12:17:44.046654 systemd[1]: sysroot-oem.mount: Deactivated successfully. Dec 16 12:17:44.049697 kernel: BTRFS info (device sda6): last unmount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:17:44.072905 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 16 12:17:44.072000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:44.080656 ignition[997]: INFO : Ignition 2.24.0 Dec 16 12:17:44.080656 ignition[997]: INFO : Stage: mount Dec 16 12:17:44.080656 ignition[997]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:17:44.080656 ignition[997]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:17:44.083698 ignition[997]: INFO : mount: mount passed Dec 16 12:17:44.083698 ignition[997]: INFO : Ignition finished successfully Dec 16 12:17:44.087685 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 16 12:17:44.087000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:44.091737 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 16 12:17:44.798613 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Dec 16 12:17:44.825755 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1008) Dec 16 12:17:44.825829 kernel: BTRFS info (device sda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:17:44.826794 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:17:44.832040 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 12:17:44.832134 kernel: BTRFS info (device sda6): turning on async discard Dec 16 12:17:44.832153 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 12:17:44.834551 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 12:17:44.865849 ignition[1025]: INFO : Ignition 2.24.0 Dec 16 12:17:44.865849 ignition[1025]: INFO : Stage: files Dec 16 12:17:44.867186 ignition[1025]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:17:44.867186 ignition[1025]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:17:44.867186 ignition[1025]: DEBUG : files: compiled without relabeling support, skipping Dec 16 12:17:44.869890 ignition[1025]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 16 12:17:44.869890 ignition[1025]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 16 12:17:44.873321 ignition[1025]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 16 12:17:44.874554 ignition[1025]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 16 12:17:44.876603 ignition[1025]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 16 12:17:44.876512 unknown[1025]: wrote ssh authorized keys file for user: core Dec 16 12:17:44.883455 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 16 12:17:44.883455 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Dec 16 12:17:44.949354 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 16 12:17:45.027360 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 16 12:17:45.027360 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 16 12:17:45.030292 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 16 12:17:45.030292 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:17:45.030292 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:17:45.030292 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:17:45.030292 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:17:45.030292 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:17:45.030292 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing 
file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:17:45.039350 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 12:17:45.039350 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 12:17:45.039350 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:17:45.039350 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:17:45.039350 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:17:45.039350 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-arm64.raw: attempt #1 Dec 16 12:17:45.186414 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 16 12:17:45.765432 ignition[1025]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:17:45.765432 ignition[1025]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 16 12:17:45.768687 ignition[1025]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:17:45.771434 ignition[1025]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:17:45.771434 ignition[1025]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 16 12:17:45.771434 ignition[1025]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Dec 16 12:17:45.775398 ignition[1025]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Dec 16 12:17:45.775398 ignition[1025]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Dec 16 12:17:45.775398 ignition[1025]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Dec 16 12:17:45.775398 ignition[1025]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Dec 16 12:17:45.775398 ignition[1025]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Dec 16 12:17:45.775398 ignition[1025]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:17:45.775398 ignition[1025]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:17:45.775398 ignition[1025]: INFO : files: files passed Dec 16 12:17:45.775398 ignition[1025]: INFO : Ignition finished successfully Dec 16 12:17:45.791141 kernel: kauditd_printk_skb: 28 callbacks suppressed Dec 16 12:17:45.791252 kernel: audit: type=1130 audit(1765887465.776:39): pid=1 uid=0 auid=4294967295 ses=4294967295 
subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.776000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.775019 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 16 12:17:45.782140 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 16 12:17:45.786900 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 16 12:17:45.808308 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 16 12:17:45.808727 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 16 12:17:45.813331 kernel: audit: type=1130 audit(1765887465.809:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.813361 kernel: audit: type=1131 audit(1765887465.811:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.809000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.811000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.821767 initrd-setup-root-after-ignition[1056]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:17:45.821767 initrd-setup-root-after-ignition[1056]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:17:45.824351 initrd-setup-root-after-ignition[1060]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:17:45.825875 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:17:45.826000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.826841 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 16 12:17:45.829926 kernel: audit: type=1130 audit(1765887465.826:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.831001 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 16 12:17:45.893496 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 16 12:17:45.893715 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Dec 16 12:17:45.895987 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 16 12:17:45.901076 kernel: audit: type=1130 audit(1765887465.895:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:45.901114 kernel: audit: type=1131 audit(1765887465.895:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.895000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.895000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.899811 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 16 12:17:45.901898 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 16 12:17:45.902834 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 16 12:17:45.927512 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:17:45.929000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.931634 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 16 12:17:45.934648 kernel: audit: type=1130 audit(1765887465.929:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.960292 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Dec 16 12:17:45.960565 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:17:45.962901 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:17:45.964476 systemd[1]: Stopped target timers.target - Timer Units. Dec 16 12:17:45.965673 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 16 12:17:45.966320 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:17:45.967000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.970011 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 16 12:17:45.971159 kernel: audit: type=1131 audit(1765887465.967:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.971364 systemd[1]: Stopped target basic.target - Basic System. Dec 16 12:17:45.972443 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 16 12:17:45.973732 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:17:45.976394 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 16 12:17:45.977552 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 16 12:17:45.979294 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 16 12:17:45.980650 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. 
Dec 16 12:17:45.981924 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 16 12:17:45.982956 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 16 12:17:45.983801 systemd[1]: Stopped target swap.target - Swaps. Dec 16 12:17:45.984570 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 16 12:17:45.984784 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 16 12:17:45.984000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.987019 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:17:45.988420 kernel: audit: type=1131 audit(1765887465.984:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.988051 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:17:45.989163 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 16 12:17:45.990227 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:17:45.991672 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 16 12:17:45.991824 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 16 12:17:45.992000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.995914 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 16 12:17:45.997800 kernel: audit: type=1131 audit(1765887465.992:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.996129 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:17:45.996000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.997000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.997134 systemd[1]: ignition-files.service: Deactivated successfully. Dec 16 12:17:45.997254 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 16 12:17:45.999000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:45.998568 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Dec 16 12:17:45.998722 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 16 12:17:46.001213 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 16 12:17:46.002561 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 16 12:17:46.002745 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. 
Dec 16 12:17:46.005000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.006896 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 16 12:17:46.007370 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 16 12:17:46.007497 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:17:46.010000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.012747 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 16 12:17:46.012897 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:17:46.013000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.014717 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 16 12:17:46.014849 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 12:17:46.016000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.023347 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 16 12:17:46.024000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.024000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.024672 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 16 12:17:46.036504 ignition[1080]: INFO : Ignition 2.24.0 Dec 16 12:17:46.036504 ignition[1080]: INFO : Stage: umount Dec 16 12:17:46.037591 ignition[1080]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:17:46.037591 ignition[1080]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:17:46.037591 ignition[1080]: INFO : umount: umount passed Dec 16 12:17:46.042000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.043000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.044823 ignition[1080]: INFO : Ignition finished successfully Dec 16 12:17:46.044000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.038708 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 16 12:17:46.049000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:46.040270 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 16 12:17:46.040409 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 16 12:17:46.043022 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 16 12:17:46.052000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.043073 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 16 12:17:46.044074 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 16 12:17:46.044157 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 16 12:17:46.045500 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 16 12:17:46.045558 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 16 12:17:46.049309 systemd[1]: Stopped target network.target - Network. Dec 16 12:17:46.050303 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 16 12:17:46.050388 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 12:17:46.052389 systemd[1]: Stopped target paths.target - Path Units. Dec 16 12:17:46.073000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.056253 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 16 12:17:46.075000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.061568 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:17:46.062442 systemd[1]: Stopped target slices.target - Slice Units. Dec 16 12:17:46.064878 systemd[1]: Stopped target sockets.target - Socket Units. Dec 16 12:17:46.065930 systemd[1]: iscsid.socket: Deactivated successfully. Dec 16 12:17:46.065977 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 12:17:46.081000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.067438 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 16 12:17:46.067469 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 12:17:46.082000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.073270 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Dec 16 12:17:46.073302 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:17:46.073939 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 16 12:17:46.073993 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 16 12:17:46.074612 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 16 12:17:46.074673 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 16 12:17:46.076077 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Dec 16 12:17:46.079675 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 16 12:17:46.088000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.080874 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 16 12:17:46.080978 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 16 12:17:46.082293 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 16 12:17:46.090000 audit: BPF prog-id=6 op=UNLOAD Dec 16 12:17:46.082393 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 16 12:17:46.087935 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 16 12:17:46.088076 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 16 12:17:46.092936 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 16 12:17:46.093315 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 16 12:17:46.094000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.096159 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 16 12:17:46.096000 audit: BPF prog-id=9 op=UNLOAD Dec 16 12:17:46.097084 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 16 12:17:46.097138 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:17:46.099743 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 16 12:17:46.100287 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 16 12:17:46.100356 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 12:17:46.103000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.103743 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 16 12:17:46.103823 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:17:46.106798 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 16 12:17:46.106000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.108000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.106849 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 16 12:17:46.109755 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:17:46.127228 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 16 12:17:46.129035 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:17:46.130000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:46.133255 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 16 12:17:46.133970 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 16 12:17:46.135000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.135334 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 16 12:17:46.135383 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:17:46.135996 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 16 12:17:46.136052 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 16 12:17:46.138851 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 16 12:17:46.139446 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 16 12:17:46.140000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.140942 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 16 12:17:46.141508 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 12:17:46.142000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.144498 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 16 12:17:46.145840 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 16 12:17:46.146513 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:17:46.147000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.148233 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 16 12:17:46.149000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.148293 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:17:46.149738 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:17:46.151000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.149786 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:17:46.152493 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 16 12:17:46.154970 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 16 12:17:46.155000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.164586 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. 
Dec 16 12:17:46.166030 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 16 12:17:46.166000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.166000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.168174 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 16 12:17:46.171838 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 16 12:17:46.198649 systemd[1]: Switching root. Dec 16 12:17:46.243801 systemd-journald[348]: Journal stopped Dec 16 12:17:47.221046 systemd-journald[348]: Received SIGTERM from PID 1 (systemd). Dec 16 12:17:47.221118 kernel: SELinux: policy capability network_peer_controls=1 Dec 16 12:17:47.221132 kernel: SELinux: policy capability open_perms=1 Dec 16 12:17:47.221150 kernel: SELinux: policy capability extended_socket_class=1 Dec 16 12:17:47.221160 kernel: SELinux: policy capability always_check_network=0 Dec 16 12:17:47.221170 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 16 12:17:47.221181 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 16 12:17:47.221191 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 16 12:17:47.221200 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 16 12:17:47.221214 kernel: SELinux: policy capability userspace_initial_context=0 Dec 16 12:17:47.221225 systemd[1]: Successfully loaded SELinux policy in 52.370ms. Dec 16 12:17:47.221253 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.418ms. Dec 16 12:17:47.221265 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 12:17:47.221276 systemd[1]: Detected virtualization kvm. Dec 16 12:17:47.221290 systemd[1]: Detected architecture arm64. Dec 16 12:17:47.221301 systemd[1]: Detected first boot. Dec 16 12:17:47.221313 systemd[1]: Hostname set to . Dec 16 12:17:47.221324 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Dec 16 12:17:47.221335 zram_generator::config[1124]: No configuration found. Dec 16 12:17:47.221349 kernel: NET: Registered PF_VSOCK protocol family Dec 16 12:17:47.221359 systemd[1]: Populated /etc with preset unit settings. Dec 16 12:17:47.221371 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 16 12:17:47.221382 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 16 12:17:47.221398 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 16 12:17:47.221413 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 16 12:17:47.221428 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 16 12:17:47.221441 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 16 12:17:47.221452 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 16 12:17:47.221466 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. 
Dec 16 12:17:47.221478 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Dec 16 12:17:47.221490 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 16 12:17:47.221501 systemd[1]: Created slice user.slice - User and Session Slice. Dec 16 12:17:47.221511 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:17:47.221522 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:17:47.221533 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 16 12:17:47.221544 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 16 12:17:47.221555 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 16 12:17:47.221568 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 12:17:47.221579 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Dec 16 12:17:47.221591 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:17:47.221602 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:17:47.221613 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 16 12:17:47.228706 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 16 12:17:47.228732 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 16 12:17:47.228745 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 16 12:17:47.228757 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:17:47.228768 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 12:17:47.228780 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Dec 16 12:17:47.228792 systemd[1]: Reached target slices.target - Slice Units. Dec 16 12:17:47.228812 systemd[1]: Reached target swap.target - Swaps. Dec 16 12:17:47.228824 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Dec 16 12:17:47.228834 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 16 12:17:47.228845 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 16 12:17:47.228856 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:17:47.228867 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Dec 16 12:17:47.228878 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:17:47.228891 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Dec 16 12:17:47.228903 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Dec 16 12:17:47.228915 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 12:17:47.228926 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:17:47.228937 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 16 12:17:47.228949 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 16 12:17:47.228960 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... 
Dec 16 12:17:47.228974 systemd[1]: Mounting media.mount - External Media Directory... Dec 16 12:17:47.228985 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 16 12:17:47.228996 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 16 12:17:47.229012 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 16 12:17:47.229025 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 16 12:17:47.229036 systemd[1]: Reached target machines.target - Containers. Dec 16 12:17:47.229047 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 16 12:17:47.229059 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:17:47.229070 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 12:17:47.229098 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 16 12:17:47.229112 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 12:17:47.229123 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 12:17:47.229134 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 12:17:47.229145 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 16 12:17:47.229159 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 12:17:47.229171 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 16 12:17:47.229181 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 16 12:17:47.229194 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 16 12:17:47.229208 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 16 12:17:47.229219 systemd[1]: Stopped systemd-fsck-usr.service. Dec 16 12:17:47.229231 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:17:47.229242 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 12:17:47.229253 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 12:17:47.229263 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 12:17:47.229277 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 16 12:17:47.229288 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 16 12:17:47.229301 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 12:17:47.229312 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 16 12:17:47.229323 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 16 12:17:47.229334 systemd[1]: Mounted media.mount - External Media Directory. Dec 16 12:17:47.229347 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 16 12:17:47.229358 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. 
Dec 16 12:17:47.229369 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 16 12:17:47.229380 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:17:47.229392 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 16 12:17:47.229405 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 16 12:17:47.229416 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 12:17:47.229426 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 12:17:47.229437 kernel: fuse: init (API version 7.41) Dec 16 12:17:47.229449 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 12:17:47.229461 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 12:17:47.229472 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 12:17:47.229486 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 12:17:47.229497 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 16 12:17:47.229508 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 16 12:17:47.229519 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 12:17:47.229530 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:17:47.229544 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 16 12:17:47.229556 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 16 12:17:47.229568 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 12:17:47.229579 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Dec 16 12:17:47.229591 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 16 12:17:47.229602 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 12:17:47.229612 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 16 12:17:47.229643 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:17:47.229659 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 12:17:47.229672 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 16 12:17:47.229684 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 12:17:47.229695 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 16 12:17:47.229706 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 12:17:47.229750 systemd-journald[1188]: Collecting audit messages is enabled. Dec 16 12:17:47.229776 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 12:17:47.229789 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 16 12:17:47.229801 systemd-journald[1188]: Journal started Dec 16 12:17:47.229824 systemd-journald[1188]: Runtime Journal (/run/log/journal/f9f1d3aecfed44b49c7ab89853b071fa) is 8M, max 76.5M, 68.5M free. 
Dec 16 12:17:47.236676 kernel: ACPI: bus type drm_connector registered Dec 16 12:17:47.238918 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 12:17:46.955000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Dec 16 12:17:47.063000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.064000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.066000 audit: BPF prog-id=14 op=UNLOAD Dec 16 12:17:47.066000 audit: BPF prog-id=13 op=UNLOAD Dec 16 12:17:47.072000 audit: BPF prog-id=15 op=LOAD Dec 16 12:17:47.073000 audit: BPF prog-id=16 op=LOAD Dec 16 12:17:47.073000 audit: BPF prog-id=17 op=LOAD Dec 16 12:17:47.133000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.138000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.138000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.151000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.151000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.159000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.159000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.167000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.167000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.173000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:47.173000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.175000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.179000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.183000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.186000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.214000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Dec 16 12:17:47.214000 audit[1188]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=6 a1=ffffe5f75d00 a2=4000 a3=0 items=0 ppid=1 pid=1188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:47.214000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Dec 16 12:17:47.237000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.880202 systemd[1]: Queued start job for default target multi-user.target. Dec 16 12:17:46.905298 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Dec 16 12:17:46.906231 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 16 12:17:47.252000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.252000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.262000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.241247 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 16 12:17:47.250020 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 12:17:47.252713 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 12:17:47.262697 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 16 12:17:47.263572 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. 
Dec 16 12:17:47.272849 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 16 12:17:47.286765 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:17:47.287000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.298988 kernel: loop1: detected capacity change from 0 to 8 Dec 16 12:17:47.315366 systemd-journald[1188]: Time spent on flushing to /var/log/journal/f9f1d3aecfed44b49c7ab89853b071fa is 33.368ms for 1290 entries. Dec 16 12:17:47.315366 systemd-journald[1188]: System Journal (/var/log/journal/f9f1d3aecfed44b49c7ab89853b071fa) is 8M, max 588.1M, 580.1M free. Dec 16 12:17:47.362143 systemd-journald[1188]: Received client request to flush runtime journal. Dec 16 12:17:47.362203 kernel: loop2: detected capacity change from 0 to 100192 Dec 16 12:17:47.362218 kernel: loop3: detected capacity change from 0 to 45344 Dec 16 12:17:47.318000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.321000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.334000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.318704 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:17:47.320757 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 16 12:17:47.333896 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 16 12:17:47.340898 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 16 12:17:47.366657 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 16 12:17:47.367000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.383653 kernel: loop4: detected capacity change from 0 to 200800 Dec 16 12:17:47.395772 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 16 12:17:47.396000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.397000 audit: BPF prog-id=18 op=LOAD Dec 16 12:17:47.397000 audit: BPF prog-id=19 op=LOAD Dec 16 12:17:47.397000 audit: BPF prog-id=20 op=LOAD Dec 16 12:17:47.399269 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Dec 16 12:17:47.400000 audit: BPF prog-id=21 op=LOAD Dec 16 12:17:47.403921 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 12:17:47.405759 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Dec 16 12:17:47.415000 audit: BPF prog-id=22 op=LOAD Dec 16 12:17:47.416000 audit: BPF prog-id=23 op=LOAD Dec 16 12:17:47.416000 audit: BPF prog-id=24 op=LOAD Dec 16 12:17:47.418875 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Dec 16 12:17:47.419000 audit: BPF prog-id=25 op=LOAD Dec 16 12:17:47.420000 audit: BPF prog-id=26 op=LOAD Dec 16 12:17:47.421000 audit: BPF prog-id=27 op=LOAD Dec 16 12:17:47.422332 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 16 12:17:47.427746 kernel: loop5: detected capacity change from 0 to 8 Dec 16 12:17:47.436675 kernel: loop6: detected capacity change from 0 to 100192 Dec 16 12:17:47.458649 kernel: loop7: detected capacity change from 0 to 45344 Dec 16 12:17:47.467961 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 16 12:17:47.469000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.475439 systemd-tmpfiles[1261]: ACLs are not supported, ignoring. Dec 16 12:17:47.475760 kernel: loop1: detected capacity change from 0 to 200800 Dec 16 12:17:47.475459 systemd-tmpfiles[1261]: ACLs are not supported, ignoring. Dec 16 12:17:47.486000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.485833 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:17:47.496227 (sd-merge)[1264]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-hetzner.raw'. Dec 16 12:17:47.501451 (sd-merge)[1264]: Merged extensions into '/usr'. Dec 16 12:17:47.509691 systemd-nsresourced[1262]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Dec 16 12:17:47.509845 systemd[1]: Reload requested from client PID 1216 ('systemd-sysext') (unit systemd-sysext.service)... Dec 16 12:17:47.509860 systemd[1]: Reloading... Dec 16 12:17:47.632648 zram_generator::config[1308]: No configuration found. Dec 16 12:17:47.655900 systemd-oomd[1259]: No swap; memory pressure usage will be degraded Dec 16 12:17:47.696420 systemd-resolved[1260]: Positive Trust Anchors: Dec 16 12:17:47.696768 systemd-resolved[1260]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 12:17:47.696774 systemd-resolved[1260]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Dec 16 12:17:47.696808 systemd-resolved[1260]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 12:17:47.708212 systemd-resolved[1260]: Using system hostname 'ci-4547-0-0-5-8fe0b910ae'. Dec 16 12:17:47.851184 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 16 12:17:47.851450 systemd[1]: Reloading finished in 341 ms. 
Dec 16 12:17:47.877665 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Dec 16 12:17:47.878000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.879858 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Dec 16 12:17:47.879000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.880869 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 12:17:47.880000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.883661 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 16 12:17:47.883000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.887648 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:17:47.895850 systemd[1]: Starting ensure-sysext.service... Dec 16 12:17:47.898979 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 12:17:47.901000 audit: BPF prog-id=28 op=LOAD Dec 16 12:17:47.901000 audit: BPF prog-id=22 op=UNLOAD Dec 16 12:17:47.901000 audit: BPF prog-id=29 op=LOAD Dec 16 12:17:47.901000 audit: BPF prog-id=30 op=LOAD Dec 16 12:17:47.901000 audit: BPF prog-id=23 op=UNLOAD Dec 16 12:17:47.901000 audit: BPF prog-id=24 op=UNLOAD Dec 16 12:17:47.902000 audit: BPF prog-id=31 op=LOAD Dec 16 12:17:47.903000 audit: BPF prog-id=15 op=UNLOAD Dec 16 12:17:47.903000 audit: BPF prog-id=32 op=LOAD Dec 16 12:17:47.903000 audit: BPF prog-id=33 op=LOAD Dec 16 12:17:47.903000 audit: BPF prog-id=16 op=UNLOAD Dec 16 12:17:47.903000 audit: BPF prog-id=17 op=UNLOAD Dec 16 12:17:47.903000 audit: BPF prog-id=34 op=LOAD Dec 16 12:17:47.903000 audit: BPF prog-id=25 op=UNLOAD Dec 16 12:17:47.905000 audit: BPF prog-id=35 op=LOAD Dec 16 12:17:47.905000 audit: BPF prog-id=36 op=LOAD Dec 16 12:17:47.905000 audit: BPF prog-id=26 op=UNLOAD Dec 16 12:17:47.905000 audit: BPF prog-id=27 op=UNLOAD Dec 16 12:17:47.906000 audit: BPF prog-id=37 op=LOAD Dec 16 12:17:47.906000 audit: BPF prog-id=21 op=UNLOAD Dec 16 12:17:47.907000 audit: BPF prog-id=38 op=LOAD Dec 16 12:17:47.907000 audit: BPF prog-id=18 op=UNLOAD Dec 16 12:17:47.907000 audit: BPF prog-id=39 op=LOAD Dec 16 12:17:47.908000 audit: BPF prog-id=40 op=LOAD Dec 16 12:17:47.908000 audit: BPF prog-id=19 op=UNLOAD Dec 16 12:17:47.908000 audit: BPF prog-id=20 op=UNLOAD Dec 16 12:17:47.921578 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 16 12:17:47.927547 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 16 12:17:47.933855 systemd-tmpfiles[1345]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. 
Dec 16 12:17:47.933891 systemd-tmpfiles[1345]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 16 12:17:47.935145 systemd[1]: Reload requested from client PID 1344 ('systemctl') (unit ensure-sysext.service)... Dec 16 12:17:47.935262 systemd[1]: Reloading... Dec 16 12:17:47.935987 systemd-tmpfiles[1345]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 16 12:17:47.938148 systemd-tmpfiles[1345]: ACLs are not supported, ignoring. Dec 16 12:17:47.938216 systemd-tmpfiles[1345]: ACLs are not supported, ignoring. Dec 16 12:17:47.952271 systemd-tmpfiles[1345]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 12:17:47.952287 systemd-tmpfiles[1345]: Skipping /boot Dec 16 12:17:47.965524 systemd-tmpfiles[1345]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 12:17:47.965683 systemd-tmpfiles[1345]: Skipping /boot Dec 16 12:17:48.030666 zram_generator::config[1388]: No configuration found. Dec 16 12:17:48.189801 systemd[1]: Reloading finished in 254 ms. Dec 16 12:17:48.218354 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 16 12:17:48.218000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.221000 audit: BPF prog-id=41 op=LOAD Dec 16 12:17:48.221000 audit: BPF prog-id=38 op=UNLOAD Dec 16 12:17:48.221000 audit: BPF prog-id=42 op=LOAD Dec 16 12:17:48.222000 audit: BPF prog-id=43 op=LOAD Dec 16 12:17:48.222000 audit: BPF prog-id=39 op=UNLOAD Dec 16 12:17:48.222000 audit: BPF prog-id=40 op=UNLOAD Dec 16 12:17:48.223000 audit: BPF prog-id=44 op=LOAD Dec 16 12:17:48.223000 audit: BPF prog-id=31 op=UNLOAD Dec 16 12:17:48.223000 audit: BPF prog-id=45 op=LOAD Dec 16 12:17:48.223000 audit: BPF prog-id=46 op=LOAD Dec 16 12:17:48.223000 audit: BPF prog-id=32 op=UNLOAD Dec 16 12:17:48.223000 audit: BPF prog-id=33 op=UNLOAD Dec 16 12:17:48.223000 audit: BPF prog-id=47 op=LOAD Dec 16 12:17:48.223000 audit: BPF prog-id=28 op=UNLOAD Dec 16 12:17:48.223000 audit: BPF prog-id=48 op=LOAD Dec 16 12:17:48.223000 audit: BPF prog-id=49 op=LOAD Dec 16 12:17:48.223000 audit: BPF prog-id=29 op=UNLOAD Dec 16 12:17:48.223000 audit: BPF prog-id=30 op=UNLOAD Dec 16 12:17:48.224000 audit: BPF prog-id=50 op=LOAD Dec 16 12:17:48.224000 audit: BPF prog-id=37 op=UNLOAD Dec 16 12:17:48.224000 audit: BPF prog-id=51 op=LOAD Dec 16 12:17:48.224000 audit: BPF prog-id=34 op=UNLOAD Dec 16 12:17:48.224000 audit: BPF prog-id=52 op=LOAD Dec 16 12:17:48.225000 audit: BPF prog-id=53 op=LOAD Dec 16 12:17:48.225000 audit: BPF prog-id=35 op=UNLOAD Dec 16 12:17:48.225000 audit: BPF prog-id=36 op=UNLOAD Dec 16 12:17:48.228662 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:17:48.228000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.230520 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 16 12:17:48.231608 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 16 12:17:48.242214 systemd[1]: Starting audit-rules.service - Load Audit Rules... 
Dec 16 12:17:48.246938 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 16 12:17:48.249274 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 16 12:17:48.253896 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 16 12:17:48.254000 audit: BPF prog-id=54 op=LOAD Dec 16 12:17:48.254000 audit: BPF prog-id=7 op=UNLOAD Dec 16 12:17:48.254000 audit: BPF prog-id=8 op=UNLOAD Dec 16 12:17:48.255000 audit: BPF prog-id=55 op=LOAD Dec 16 12:17:48.257309 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:17:48.261209 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 16 12:17:48.265366 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:17:48.266575 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 12:17:48.274060 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 12:17:48.277285 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 12:17:48.278702 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:17:48.278923 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 12:17:48.279019 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:17:48.283977 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:17:48.284194 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:17:48.284338 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 12:17:48.284418 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:17:48.291487 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:17:48.302126 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 12:17:48.302802 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:17:48.302967 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 12:17:48.303051 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:17:48.308999 systemd[1]: Finished ensure-sysext.service. 
Dec 16 12:17:48.308000 audit[1425]: SYSTEM_BOOT pid=1425 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.309000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.313000 audit: BPF prog-id=56 op=LOAD Dec 16 12:17:48.316499 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Dec 16 12:17:48.330875 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 16 12:17:48.331000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.352521 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 12:17:48.352776 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 12:17:48.352000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.352000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.365426 systemd-udevd[1424]: Using default interface naming scheme 'v257'. Dec 16 12:17:48.366750 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 12:17:48.366000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.367004 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 12:17:48.369000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.370870 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 12:17:48.374927 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 12:17:48.378989 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 12:17:48.379000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.379000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.380687 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 12:17:48.384963 systemd[1]: modprobe@drm.service: Deactivated successfully. 
Dec 16 12:17:48.385213 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 12:17:48.385000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.385000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.419901 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 16 12:17:48.420000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.427934 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Dec 16 12:17:48.428000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-timesyncd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:48.429729 systemd[1]: Reached target time-set.target - System Time Set. Dec 16 12:17:48.430000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Dec 16 12:17:48.430000 audit[1460]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffd6793080 a2=420 a3=0 items=0 ppid=1420 pid=1460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:48.430000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:17:48.431171 augenrules[1460]: No rules Dec 16 12:17:48.433478 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:17:48.434548 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 12:17:48.443725 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:17:48.451312 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 12:17:48.466268 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 16 12:17:48.468561 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 16 12:17:48.563378 systemd-networkd[1470]: lo: Link UP Dec 16 12:17:48.563392 systemd-networkd[1470]: lo: Gained carrier Dec 16 12:17:48.580148 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 12:17:48.581805 systemd[1]: Reached target network.target - Network. Dec 16 12:17:48.584781 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 16 12:17:48.591825 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 16 12:17:48.632570 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 16 12:17:48.650274 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. 
Dec 16 12:17:48.665019 systemd-networkd[1470]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:17:48.665034 systemd-networkd[1470]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:17:48.666183 systemd-networkd[1470]: eth1: Link UP Dec 16 12:17:48.668599 systemd-networkd[1470]: eth1: Gained carrier Dec 16 12:17:48.668640 systemd-networkd[1470]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:17:48.707021 systemd-networkd[1470]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Dec 16 12:17:48.711053 systemd-timesyncd[1437]: Network configuration changed, trying to establish connection. Dec 16 12:17:48.740281 systemd-networkd[1470]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:17:48.740295 systemd-networkd[1470]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:17:48.748195 systemd-timesyncd[1437]: Network configuration changed, trying to establish connection. Dec 16 12:17:48.749274 systemd-networkd[1470]: eth0: Link UP Dec 16 12:17:48.750396 systemd-timesyncd[1437]: Network configuration changed, trying to establish connection. Dec 16 12:17:48.751798 systemd-networkd[1470]: eth0: Gained carrier Dec 16 12:17:48.751832 systemd-networkd[1470]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:17:48.760508 systemd-timesyncd[1437]: Network configuration changed, trying to establish connection. Dec 16 12:17:48.770697 kernel: mousedev: PS/2 mouse device common for all mice Dec 16 12:17:48.803515 ldconfig[1422]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 16 12:17:48.809847 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 16 12:17:48.815510 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 16 12:17:48.816214 systemd-networkd[1470]: eth0: DHCPv4 address 46.224.130.63/32, gateway 172.31.1.1 acquired from 172.31.1.1 Dec 16 12:17:48.817753 systemd-timesyncd[1437]: Network configuration changed, trying to establish connection. Dec 16 12:17:48.818120 systemd-timesyncd[1437]: Network configuration changed, trying to establish connection. Dec 16 12:17:48.827768 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 16 12:17:48.831402 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 16 12:17:48.857744 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 16 12:17:48.859056 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 12:17:48.859819 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 16 12:17:48.860457 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 16 12:17:48.861570 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 16 12:17:48.862396 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 16 12:17:48.863256 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. 
Dec 16 12:17:48.865084 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Dec 16 12:17:48.865955 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 16 12:17:48.867048 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 16 12:17:48.867139 systemd[1]: Reached target paths.target - Path Units. Dec 16 12:17:48.868386 systemd[1]: Reached target timers.target - Timer Units. Dec 16 12:17:48.870314 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 16 12:17:48.874791 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 16 12:17:48.880780 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 16 12:17:48.882981 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 16 12:17:48.884711 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 16 12:17:48.894161 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 16 12:17:48.896718 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 16 12:17:48.901371 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 16 12:17:48.902399 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 16 12:17:48.906830 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 12:17:48.907420 systemd[1]: Reached target basic.target - Basic System. Dec 16 12:17:48.908034 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 16 12:17:48.908083 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 16 12:17:48.909262 systemd[1]: Starting containerd.service - containerd container runtime... Dec 16 12:17:48.914834 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 16 12:17:48.918009 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 16 12:17:48.921553 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 16 12:17:48.930846 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 16 12:17:48.934234 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 16 12:17:48.934817 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 16 12:17:48.937028 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 16 12:17:48.939851 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... 
Dec 16 12:17:48.955851 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Dec 16 12:17:48.955912 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 16 12:17:48.955926 kernel: [drm] features: -context_init Dec 16 12:17:48.970687 coreos-metadata[1524]: Dec 16 12:17:48.970 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Dec 16 12:17:48.971653 kernel: [drm] number of scanouts: 1 Dec 16 12:17:48.971735 kernel: [drm] number of cap sets: 0 Dec 16 12:17:48.972476 coreos-metadata[1524]: Dec 16 12:17:48.972 INFO Fetch successful Dec 16 12:17:48.972814 coreos-metadata[1524]: Dec 16 12:17:48.972 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Dec 16 12:17:48.978422 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Dec 16 12:17:48.976883 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 16 12:17:48.979199 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 16 12:17:48.982736 coreos-metadata[1524]: Dec 16 12:17:48.973 INFO Fetch successful Dec 16 12:17:48.987317 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 16 12:17:48.987978 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 16 12:17:48.988553 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 16 12:17:48.992914 systemd[1]: Starting update-engine.service - Update Engine... Dec 16 12:17:48.993805 jq[1529]: false Dec 16 12:17:49.007503 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 12:17:49.016665 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 16 12:17:49.025855 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 16 12:17:49.034258 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 16 12:17:49.036979 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 16 12:17:49.037235 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 16 12:17:49.043170 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Dec 16 12:17:49.043426 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 16 12:17:49.044734 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 16 12:17:49.064946 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. 
Dec 16 12:17:49.087469 jq[1549]: true Dec 16 12:17:49.114378 dbus-daemon[1525]: [system] SELinux support is enabled Dec 16 12:17:49.119324 update_engine[1544]: I20251216 12:17:49.118961 1544 main.cc:92] Flatcar Update Engine starting Dec 16 12:17:49.126104 extend-filesystems[1530]: Found /dev/sda6 Dec 16 12:17:49.128808 update_engine[1544]: I20251216 12:17:49.128615 1544 update_check_scheduler.cc:74] Next update check in 10m51s Dec 16 12:17:49.133105 extend-filesystems[1530]: Found /dev/sda9 Dec 16 12:17:49.137759 extend-filesystems[1530]: Checking size of /dev/sda9 Dec 16 12:17:49.151103 extend-filesystems[1530]: Resized partition /dev/sda9 Dec 16 12:17:49.154824 extend-filesystems[1589]: resize2fs 1.47.3 (8-Jul-2025) Dec 16 12:17:49.160722 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 8410107 blocks Dec 16 12:17:49.165631 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 16 12:17:49.171948 systemd[1]: motdgen.service: Deactivated successfully. Dec 16 12:17:49.174695 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 16 12:17:49.182368 systemd[1]: Started update-engine.service - Update Engine. Dec 16 12:17:49.189844 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 16 12:17:49.189899 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 16 12:17:49.193373 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 16 12:17:49.193393 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 16 12:17:49.212259 jq[1582]: true Dec 16 12:17:49.217608 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 16 12:17:49.224400 tar[1554]: linux-arm64/LICENSE Dec 16 12:17:49.224400 tar[1554]: linux-arm64/helm Dec 16 12:17:49.270293 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 12:17:49.273112 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 16 12:17:49.278675 kernel: EXT4-fs (sda9): resized filesystem to 8410107 Dec 16 12:17:49.289264 extend-filesystems[1589]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Dec 16 12:17:49.289264 extend-filesystems[1589]: old_desc_blocks = 1, new_desc_blocks = 5 Dec 16 12:17:49.289264 extend-filesystems[1589]: The filesystem on /dev/sda9 is now 8410107 (4k) blocks long. Dec 16 12:17:49.291921 extend-filesystems[1530]: Resized filesystem in /dev/sda9 Dec 16 12:17:49.291132 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 12:17:49.294160 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 16 12:17:49.311819 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:17:49.339915 bash[1617]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:17:49.346503 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 16 12:17:49.355521 systemd[1]: Starting sshkeys.service... Dec 16 12:17:49.404670 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
Dec 16 12:17:49.408687 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Dec 16 12:17:49.467785 containerd[1576]: time="2025-12-16T12:17:49Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 16 12:17:49.477406 containerd[1576]: time="2025-12-16T12:17:49.477243600Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Dec 16 12:17:49.503807 coreos-metadata[1628]: Dec 16 12:17:49.503 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Dec 16 12:17:49.503807 coreos-metadata[1628]: Dec 16 12:17:49.503 INFO Fetch successful Dec 16 12:17:49.508128 unknown[1628]: wrote ssh authorized keys file for user: core Dec 16 12:17:49.532894 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:17:49.535884 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:17:49.546966 containerd[1576]: time="2025-12-16T12:17:49.546920640Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.4µs" Dec 16 12:17:49.547095 containerd[1576]: time="2025-12-16T12:17:49.547078000Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 16 12:17:49.547178 containerd[1576]: time="2025-12-16T12:17:49.547165280Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 16 12:17:49.547226 containerd[1576]: time="2025-12-16T12:17:49.547215440Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 16 12:17:49.547470 containerd[1576]: time="2025-12-16T12:17:49.547447960Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 16 12:17:49.547543 containerd[1576]: time="2025-12-16T12:17:49.547530680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 12:17:49.549412 containerd[1576]: time="2025-12-16T12:17:49.549370240Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 12:17:49.550651 containerd[1576]: time="2025-12-16T12:17:49.550164360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 12:17:49.550651 containerd[1576]: time="2025-12-16T12:17:49.550541080Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 12:17:49.550651 containerd[1576]: time="2025-12-16T12:17:49.550559760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 12:17:49.550651 containerd[1576]: time="2025-12-16T12:17:49.550572840Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 12:17:49.550651 containerd[1576]: time="2025-12-16T12:17:49.550582640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs 
type=io.containerd.snapshotter.v1 Dec 16 12:17:49.553645 containerd[1576]: time="2025-12-16T12:17:49.552748520Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Dec 16 12:17:49.553645 containerd[1576]: time="2025-12-16T12:17:49.552793280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 16 12:17:49.553645 containerd[1576]: time="2025-12-16T12:17:49.552892880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 16 12:17:49.553645 containerd[1576]: time="2025-12-16T12:17:49.553111760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 12:17:49.553645 containerd[1576]: time="2025-12-16T12:17:49.553145560Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 12:17:49.553645 containerd[1576]: time="2025-12-16T12:17:49.553155960Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 16 12:17:49.553645 containerd[1576]: time="2025-12-16T12:17:49.553183120Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 16 12:17:49.553645 containerd[1576]: time="2025-12-16T12:17:49.553421960Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 16 12:17:49.553645 containerd[1576]: time="2025-12-16T12:17:49.553483240Z" level=info msg="metadata content store policy set" policy=shared Dec 16 12:17:49.555542 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
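In the plugin loading above, containerd skips the btrfs, devmapper, zfs and erofs snapshotters (ext4 backing store, not configured, missing kernel module or tooling) and keeps overlayfs. The btrfs decision comes down to asking which mounted filesystem backs /var/lib/containerd; a rough illustration of that kind of check, not containerd's actual code, resolving the filesystem type by longest mount-point prefix match:

    # Illustrative only: find the filesystem type backing a path, the kind of
    # check behind the "must be a btrfs filesystem ... skip plugin" message.
    import os

    def fstype_for(path: str) -> str:
        path = os.path.realpath(path)
        best, best_type = "", "unknown"
        with open("/proc/self/mounts") as mounts:
            for line in mounts:
                _dev, mountpoint, fstype, *_ = line.split()
                if path == mountpoint or path.startswith(mountpoint.rstrip("/") + "/"):
                    if len(mountpoint) > len(best):
                        best, best_type = mountpoint, fstype
        return best_type

    print(fstype_for("/var/lib/containerd"))  # "ext4" on this host, per the log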
Dec 16 12:17:49.566455 containerd[1576]: time="2025-12-16T12:17:49.566217520Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 12:17:49.566455 containerd[1576]: time="2025-12-16T12:17:49.566300280Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:17:49.566566 containerd[1576]: time="2025-12-16T12:17:49.566519720Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:17:49.566566 containerd[1576]: time="2025-12-16T12:17:49.566543360Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 12:17:49.566600 containerd[1576]: time="2025-12-16T12:17:49.566574320Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 12:17:49.566600 containerd[1576]: time="2025-12-16T12:17:49.566588880Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 12:17:49.566669 containerd[1576]: time="2025-12-16T12:17:49.566601800Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 12:17:49.566669 containerd[1576]: time="2025-12-16T12:17:49.566612160Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 12:17:49.566669 containerd[1576]: time="2025-12-16T12:17:49.566636280Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 12:17:49.566669 containerd[1576]: time="2025-12-16T12:17:49.566650240Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 12:17:49.566669 containerd[1576]: time="2025-12-16T12:17:49.566661680Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 12:17:49.566746 containerd[1576]: time="2025-12-16T12:17:49.566679400Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 12:17:49.566746 containerd[1576]: time="2025-12-16T12:17:49.566689840Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 12:17:49.566746 containerd[1576]: time="2025-12-16T12:17:49.566710360Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 12:17:49.566917 containerd[1576]: time="2025-12-16T12:17:49.566886160Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 12:17:49.566947 containerd[1576]: time="2025-12-16T12:17:49.566916480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 12:17:49.566947 containerd[1576]: time="2025-12-16T12:17:49.566943720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 12:17:49.566989 containerd[1576]: time="2025-12-16T12:17:49.566955520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 12:17:49.566989 containerd[1576]: time="2025-12-16T12:17:49.566966360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 12:17:49.566989 containerd[1576]: 
time="2025-12-16T12:17:49.566976040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 12:17:49.567038 containerd[1576]: time="2025-12-16T12:17:49.567000280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 12:17:49.567038 containerd[1576]: time="2025-12-16T12:17:49.567019960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 16 12:17:49.567038 containerd[1576]: time="2025-12-16T12:17:49.567032760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 12:17:49.567134 containerd[1576]: time="2025-12-16T12:17:49.567043520Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 12:17:49.567134 containerd[1576]: time="2025-12-16T12:17:49.567065680Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 12:17:49.567134 containerd[1576]: time="2025-12-16T12:17:49.567106680Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 12:17:49.567183 containerd[1576]: time="2025-12-16T12:17:49.567151840Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 12:17:49.567183 containerd[1576]: time="2025-12-16T12:17:49.567174800Z" level=info msg="Start snapshots syncer" Dec 16 12:17:49.567224 containerd[1576]: time="2025-12-16T12:17:49.567211360Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 12:17:49.569308 containerd[1576]: time="2025-12-16T12:17:49.568921200Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 
12:17:49.571108 containerd[1576]: time="2025-12-16T12:17:49.569006240Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 12:17:49.571108 containerd[1576]: time="2025-12-16T12:17:49.570737960Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 12:17:49.571108 containerd[1576]: time="2025-12-16T12:17:49.570929880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 12:17:49.571108 containerd[1576]: time="2025-12-16T12:17:49.570970280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 12:17:49.571108 containerd[1576]: time="2025-12-16T12:17:49.570982160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 12:17:49.571108 containerd[1576]: time="2025-12-16T12:17:49.570993080Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 12:17:49.571108 containerd[1576]: time="2025-12-16T12:17:49.571028280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 12:17:49.571108 containerd[1576]: time="2025-12-16T12:17:49.571041960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 12:17:49.571108 containerd[1576]: time="2025-12-16T12:17:49.571052440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 12:17:49.571333 containerd[1576]: time="2025-12-16T12:17:49.571127240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 12:17:49.571333 containerd[1576]: time="2025-12-16T12:17:49.571140800Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 12:17:49.571333 containerd[1576]: time="2025-12-16T12:17:49.571187680Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:17:49.571333 containerd[1576]: time="2025-12-16T12:17:49.571206560Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:17:49.571333 containerd[1576]: time="2025-12-16T12:17:49.571216560Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:17:49.571333 containerd[1576]: time="2025-12-16T12:17:49.571231600Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:17:49.571333 containerd[1576]: time="2025-12-16T12:17:49.571240600Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 12:17:49.571333 containerd[1576]: time="2025-12-16T12:17:49.571259120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 12:17:49.571333 containerd[1576]: time="2025-12-16T12:17:49.571271960Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 12:17:49.574049 containerd[1576]: time="2025-12-16T12:17:49.573633080Z" level=info msg="runtime interface created" Dec 16 12:17:49.574049 containerd[1576]: 
time="2025-12-16T12:17:49.573649160Z" level=info msg="created NRI interface" Dec 16 12:17:49.574049 containerd[1576]: time="2025-12-16T12:17:49.573659040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 12:17:49.574049 containerd[1576]: time="2025-12-16T12:17:49.573675440Z" level=info msg="Connect containerd service" Dec 16 12:17:49.574049 containerd[1576]: time="2025-12-16T12:17:49.573727000Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 16 12:17:49.574959 containerd[1576]: time="2025-12-16T12:17:49.574768680Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 12:17:49.629982 update-ssh-keys[1637]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:17:49.631498 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 16 12:17:49.640733 systemd[1]: Finished sshkeys.service. Dec 16 12:17:49.748925 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:17:49.765248 systemd-logind[1543]: New seat seat0. Dec 16 12:17:49.774728 systemd-logind[1543]: Watching system buttons on /dev/input/event0 (Power Button) Dec 16 12:17:49.774751 systemd-logind[1543]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Dec 16 12:17:49.775117 systemd[1]: Started systemd-logind.service - User Login Management. Dec 16 12:17:49.791274 locksmithd[1596]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.864719920Z" level=info msg="Start subscribing containerd event" Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.864802360Z" level=info msg="Start recovering state" Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.864911600Z" level=info msg="Start event monitor" Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.864927960Z" level=info msg="Start cni network conf syncer for default" Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.864936400Z" level=info msg="Start streaming server" Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.864944480Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.864951840Z" level=info msg="runtime interface starting up..." Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.864957320Z" level=info msg="starting plugins..." Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.864970640Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.865446240Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.865492200Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 16 12:17:49.867152 containerd[1576]: time="2025-12-16T12:17:49.865538600Z" level=info msg="containerd successfully booted in 0.400531s" Dec 16 12:17:49.866907 systemd[1]: Started containerd.service - containerd container runtime. 
Dec 16 12:17:49.931764 systemd-networkd[1470]: eth1: Gained IPv6LL Dec 16 12:17:49.932300 systemd-timesyncd[1437]: Network configuration changed, trying to establish connection. Dec 16 12:17:49.937300 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 12:17:49.938773 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 12:17:49.942989 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:17:49.947471 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 12:17:50.012331 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 16 12:17:50.090817 tar[1554]: linux-arm64/README.md Dec 16 12:17:50.108295 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 16 12:17:50.508718 systemd-networkd[1470]: eth0: Gained IPv6LL Dec 16 12:17:50.509284 systemd-timesyncd[1437]: Network configuration changed, trying to establish connection. Dec 16 12:17:50.805456 sshd_keygen[1571]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 12:17:50.833987 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 16 12:17:50.836307 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:17:50.842978 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 16 12:17:50.847171 (kubelet)[1694]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:17:50.858504 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 12:17:50.858852 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 16 12:17:50.862967 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 12:17:50.883994 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 16 12:17:50.886679 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 12:17:50.888984 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 16 12:17:50.889914 systemd[1]: Reached target getty.target - Login Prompts. Dec 16 12:17:50.890489 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 12:17:50.891889 systemd[1]: Startup finished in 1.824s (kernel) + 5.197s (initrd) + 4.593s (userspace) = 11.615s. Dec 16 12:17:51.310299 kubelet[1694]: E1216 12:17:51.310219 1694 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:17:51.314472 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:17:51.314768 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:17:51.315231 systemd[1]: kubelet.service: Consumed 853ms CPU time, 246.9M memory peak. Dec 16 12:18:01.384760 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 12:18:01.387349 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:18:01.572778 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 16 12:18:01.587501 (kubelet)[1724]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:18:01.641188 kubelet[1724]: E1216 12:18:01.641077 1724 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:18:01.644678 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:18:01.644871 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:18:01.645825 systemd[1]: kubelet.service: Consumed 182ms CPU time, 107.5M memory peak. Dec 16 12:18:11.884755 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 12:18:11.888365 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:18:12.070914 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:18:12.086205 (kubelet)[1739]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:18:12.128959 kubelet[1739]: E1216 12:18:12.128890 1739 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:18:12.131833 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:18:12.131981 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:18:12.132343 systemd[1]: kubelet.service: Consumed 174ms CPU time, 107M memory peak. Dec 16 12:18:20.814522 systemd-timesyncd[1437]: Contacted time server 213.209.109.45:123 (2.flatcar.pool.ntp.org). Dec 16 12:18:20.815310 systemd-timesyncd[1437]: Initial clock synchronization to Tue 2025-12-16 12:18:20.481850 UTC. Dec 16 12:18:22.134504 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 16 12:18:22.137915 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:18:22.317910 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:18:22.335680 (kubelet)[1754]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:18:22.385242 kubelet[1754]: E1216 12:18:22.385116 1754 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:18:22.388212 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:18:22.388403 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:18:22.389291 systemd[1]: kubelet.service: Consumed 190ms CPU time, 106.2M memory peak. Dec 16 12:18:23.968384 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
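The kubelet crash loop above has a single cause, spelled out in the error itself: /var/lib/kubelet/config.yaml does not exist. systemd reschedules the unit roughly every ten seconds (restart counters 1-3 so far, with a fourth attempt further down), which matches the timestamps. That file is normally written by kubeadm during init/join, so the loop is expected on a node that has booted but not yet joined a cluster. As a sketch rather than a recommended fix, the smallest content that should satisfy the config-load step is just the KubeletConfiguration header, with everything else falling back to kubelet defaults:

    # Illustrative only: /var/lib/kubelet/config.yaml is normally generated by
    # `kubeadm init` / `kubeadm join`. A bare KubeletConfiguration header is the
    # minimal content the loader should accept; the node would still need a
    # kubeconfig and a cluster before kubelet becomes useful.
    minimal = (
        "apiVersion: kubelet.config.k8s.io/v1beta1\n"
        "kind: KubeletConfiguration\n"
    )
    print("/var/lib/kubelet/config.yaml would contain at minimum:")
    print(minimal)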
Dec 16 12:18:23.970944 systemd[1]: Started sshd@0-46.224.130.63:22-147.75.109.163:46876.service - OpenSSH per-connection server daemon (147.75.109.163:46876). Dec 16 12:18:24.817412 sshd[1762]: Accepted publickey for core from 147.75.109.163 port 46876 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:18:24.821632 sshd-session[1762]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:24.832081 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 12:18:24.834037 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 12:18:24.839113 systemd-logind[1543]: New session 1 of user core. Dec 16 12:18:24.866263 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 12:18:24.871180 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 12:18:24.888663 (systemd)[1768]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:24.892126 systemd-logind[1543]: New session 2 of user core. Dec 16 12:18:25.042781 systemd[1768]: Queued start job for default target default.target. Dec 16 12:18:25.054548 systemd[1768]: Created slice app.slice - User Application Slice. Dec 16 12:18:25.054651 systemd[1768]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Dec 16 12:18:25.054687 systemd[1768]: Reached target paths.target - Paths. Dec 16 12:18:25.054784 systemd[1768]: Reached target timers.target - Timers. Dec 16 12:18:25.057099 systemd[1768]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 12:18:25.059816 systemd[1768]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Dec 16 12:18:25.085098 systemd[1768]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 12:18:25.085219 systemd[1768]: Reached target sockets.target - Sockets. Dec 16 12:18:25.085891 systemd[1768]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Dec 16 12:18:25.086059 systemd[1768]: Reached target basic.target - Basic System. Dec 16 12:18:25.086133 systemd[1768]: Reached target default.target - Main User Target. Dec 16 12:18:25.086171 systemd[1768]: Startup finished in 186ms. Dec 16 12:18:25.086451 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 12:18:25.098015 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 16 12:18:25.582930 systemd[1]: Started sshd@1-46.224.130.63:22-147.75.109.163:46892.service - OpenSSH per-connection server daemon (147.75.109.163:46892). Dec 16 12:18:26.475311 sshd[1782]: Accepted publickey for core from 147.75.109.163 port 46892 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:18:26.477279 sshd-session[1782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:26.483826 systemd-logind[1543]: New session 3 of user core. Dec 16 12:18:26.495001 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 12:18:26.974804 sshd[1786]: Connection closed by 147.75.109.163 port 46892 Dec 16 12:18:26.975787 sshd-session[1782]: pam_unix(sshd:session): session closed for user core Dec 16 12:18:26.981912 systemd-logind[1543]: Session 3 logged out. Waiting for processes to exit. Dec 16 12:18:26.982497 systemd[1]: sshd@1-46.224.130.63:22-147.75.109.163:46892.service: Deactivated successfully. Dec 16 12:18:26.985096 systemd[1]: session-3.scope: Deactivated successfully. 
Dec 16 12:18:26.987511 systemd-logind[1543]: Removed session 3. Dec 16 12:18:27.144954 systemd[1]: Started sshd@2-46.224.130.63:22-147.75.109.163:46908.service - OpenSSH per-connection server daemon (147.75.109.163:46908). Dec 16 12:18:27.963712 sshd[1792]: Accepted publickey for core from 147.75.109.163 port 46908 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:18:27.965402 sshd-session[1792]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:27.970754 systemd-logind[1543]: New session 4 of user core. Dec 16 12:18:27.980938 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 16 12:18:28.423480 sshd[1796]: Connection closed by 147.75.109.163 port 46908 Dec 16 12:18:28.424353 sshd-session[1792]: pam_unix(sshd:session): session closed for user core Dec 16 12:18:28.431211 systemd[1]: sshd@2-46.224.130.63:22-147.75.109.163:46908.service: Deactivated successfully. Dec 16 12:18:28.434185 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 12:18:28.437825 systemd-logind[1543]: Session 4 logged out. Waiting for processes to exit. Dec 16 12:18:28.439146 systemd-logind[1543]: Removed session 4. Dec 16 12:18:28.604947 systemd[1]: Started sshd@3-46.224.130.63:22-147.75.109.163:46924.service - OpenSSH per-connection server daemon (147.75.109.163:46924). Dec 16 12:18:29.459936 sshd[1802]: Accepted publickey for core from 147.75.109.163 port 46924 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:18:29.461751 sshd-session[1802]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:29.468551 systemd-logind[1543]: New session 5 of user core. Dec 16 12:18:29.473987 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 16 12:18:29.937588 sshd[1806]: Connection closed by 147.75.109.163 port 46924 Dec 16 12:18:29.938240 sshd-session[1802]: pam_unix(sshd:session): session closed for user core Dec 16 12:18:29.943374 systemd-logind[1543]: Session 5 logged out. Waiting for processes to exit. Dec 16 12:18:29.944125 systemd[1]: sshd@3-46.224.130.63:22-147.75.109.163:46924.service: Deactivated successfully. Dec 16 12:18:29.947069 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 12:18:29.949517 systemd-logind[1543]: Removed session 5. Dec 16 12:18:30.105243 systemd[1]: Started sshd@4-46.224.130.63:22-147.75.109.163:46934.service - OpenSSH per-connection server daemon (147.75.109.163:46934). Dec 16 12:18:30.923088 sshd[1812]: Accepted publickey for core from 147.75.109.163 port 46934 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:18:30.925163 sshd-session[1812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:30.930103 systemd-logind[1543]: New session 6 of user core. Dec 16 12:18:30.940994 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 16 12:18:31.246186 sudo[1817]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 12:18:31.246480 sudo[1817]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:18:31.258102 sudo[1817]: pam_unix(sudo:session): session closed for user root Dec 16 12:18:31.411775 sshd[1816]: Connection closed by 147.75.109.163 port 46934 Dec 16 12:18:31.412970 sshd-session[1812]: pam_unix(sshd:session): session closed for user core Dec 16 12:18:31.419786 systemd[1]: sshd@4-46.224.130.63:22-147.75.109.163:46934.service: Deactivated successfully. 
Dec 16 12:18:31.422782 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 12:18:31.425377 systemd-logind[1543]: Session 6 logged out. Waiting for processes to exit. Dec 16 12:18:31.427838 systemd-logind[1543]: Removed session 6. Dec 16 12:18:31.584322 systemd[1]: Started sshd@5-46.224.130.63:22-147.75.109.163:46938.service - OpenSSH per-connection server daemon (147.75.109.163:46938). Dec 16 12:18:32.446818 sshd[1824]: Accepted publickey for core from 147.75.109.163 port 46938 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:18:32.449736 sshd-session[1824]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:32.451112 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Dec 16 12:18:32.454287 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:18:32.457666 systemd-logind[1543]: New session 7 of user core. Dec 16 12:18:32.466234 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 16 12:18:32.617167 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:18:32.627129 (kubelet)[1837]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:18:32.673128 kubelet[1837]: E1216 12:18:32.673075 1837 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:18:32.675946 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:18:32.676113 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:18:32.676897 systemd[1]: kubelet.service: Consumed 172ms CPU time, 106.6M memory peak. Dec 16 12:18:32.775756 sudo[1845]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 12:18:32.776569 sudo[1845]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:18:32.782263 sudo[1845]: pam_unix(sudo:session): session closed for user root Dec 16 12:18:32.792052 sudo[1844]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 12:18:32.792317 sudo[1844]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:18:32.803968 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:18:32.869000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:18:32.870974 kernel: kauditd_printk_skb: 178 callbacks suppressed Dec 16 12:18:32.871035 kernel: audit: type=1305 audit(1765887512.869:223): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:18:32.871234 augenrules[1869]: No rules Dec 16 12:18:32.869000 audit[1869]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffda5004e0 a2=420 a3=0 items=0 ppid=1850 pid=1869 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:32.874529 systemd[1]: audit-rules.service: Deactivated successfully. 
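From here on the log interleaves raw audit records, and each PROCTITLE field carries the audited command line as NUL-separated hex rather than readable text. Decoding is mechanical; here applied to the auditctl PROCTITLE that appears a few lines below (the type=1327 record for audit event :223):

    # Decode an audit PROCTITLE value: hex-encoded argv, NUL-separated.
    # Input copied from the auditctl audit record that follows below.
    hexdata = "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"
    argv = bytes.fromhex(hexdata).split(b"\x00")
    print(" ".join(arg.decode() for arg in argv))
    # -> /sbin/auditctl -R /etc/audit/audit.rules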
Dec 16 12:18:32.874942 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 12:18:32.875681 kernel: audit: type=1300 audit(1765887512.869:223): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffda5004e0 a2=420 a3=0 items=0 ppid=1850 pid=1869 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:32.875727 kernel: audit: type=1327 audit(1765887512.869:223): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:18:32.869000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:18:32.876000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:32.878943 sudo[1844]: pam_unix(sudo:session): session closed for user root Dec 16 12:18:32.876000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:32.880940 kernel: audit: type=1130 audit(1765887512.876:224): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:32.880994 kernel: audit: type=1131 audit(1765887512.876:225): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:32.878000 audit[1844]: USER_END pid=1844 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:18:32.883262 kernel: audit: type=1106 audit(1765887512.878:226): pid=1844 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:18:32.883335 kernel: audit: type=1104 audit(1765887512.878:227): pid=1844 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:18:32.878000 audit[1844]: CRED_DISP pid=1844 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 12:18:33.037931 sshd[1831]: Connection closed by 147.75.109.163 port 46938 Dec 16 12:18:33.039392 sshd-session[1824]: pam_unix(sshd:session): session closed for user core Dec 16 12:18:33.041000 audit[1824]: USER_END pid=1824 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:18:33.051534 kernel: audit: type=1106 audit(1765887513.041:228): pid=1824 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:18:33.051667 kernel: audit: type=1104 audit(1765887513.041:229): pid=1824 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:18:33.041000 audit[1824]: CRED_DISP pid=1824 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:18:33.048882 systemd-logind[1543]: Session 7 logged out. Waiting for processes to exit. Dec 16 12:18:33.051271 systemd[1]: sshd@5-46.224.130.63:22-147.75.109.163:46938.service: Deactivated successfully. Dec 16 12:18:33.051000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-46.224.130.63:22-147.75.109.163:46938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:33.054642 kernel: audit: type=1131 audit(1765887513.051:230): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-46.224.130.63:22-147.75.109.163:46938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:33.055155 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 12:18:33.056848 systemd-logind[1543]: Removed session 7. Dec 16 12:18:33.204000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-46.224.130.63:22-147.75.109.163:39552 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:33.204761 systemd[1]: Started sshd@6-46.224.130.63:22-147.75.109.163:39552.service - OpenSSH per-connection server daemon (147.75.109.163:39552). 
Dec 16 12:18:34.047000 audit[1878]: USER_ACCT pid=1878 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:18:34.048485 sshd[1878]: Accepted publickey for core from 147.75.109.163 port 39552 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:18:34.049000 audit[1878]: CRED_ACQ pid=1878 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:18:34.049000 audit[1878]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffb549930 a2=3 a3=0 items=0 ppid=1 pid=1878 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:34.049000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:34.051225 sshd-session[1878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:34.056941 systemd-logind[1543]: New session 8 of user core. Dec 16 12:18:34.065965 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 16 12:18:34.068000 audit[1878]: USER_START pid=1878 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:18:34.070000 audit[1882]: CRED_ACQ pid=1882 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:18:34.277480 update_engine[1544]: I20251216 12:18:34.276728 1544 update_attempter.cc:509] Updating boot flags... Dec 16 12:18:34.376000 audit[1899]: USER_ACCT pid=1899 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:18:34.379000 audit[1899]: CRED_REFR pid=1899 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:18:34.380428 sudo[1899]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 12:18:34.381000 audit[1899]: USER_START pid=1899 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:18:34.381975 sudo[1899]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:18:34.761656 systemd[1]: Starting docker.service - Docker Application Container Engine... 
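The wall of NETFILTER_CFG / SYSCALL / PROCTITLE records that follows is dockerd (just started above) creating its standard chains and jump rules in the nat and filter tables, for both iptables and ip6tables: DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-ISOLATION-STAGE-1/2 and DOCKER-USER. Decoding the PROCTITLE hex the same way as above makes those records readable; for example, the first docker-related entry below decodes to the chain-creation command in the comment:

    # Same PROCTITLE decoding as before, applied to the first docker-related
    # netfilter audit record below (NETFILTER_CFG table=nat, comm="iptables").
    hexdata = "2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552"
    print(" ".join(a.decode() for a in bytes.fromhex(hexdata).split(b"\x00")))
    # -> /usr/bin/iptables --wait -t nat -N DOCKER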
Dec 16 12:18:34.791183 (dockerd)[1921]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 12:18:35.030308 dockerd[1921]: time="2025-12-16T12:18:35.030122878Z" level=info msg="Starting up" Dec 16 12:18:35.032183 dockerd[1921]: time="2025-12-16T12:18:35.031976562Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 12:18:35.047590 dockerd[1921]: time="2025-12-16T12:18:35.047503257Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 12:18:35.066300 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1499705500-merged.mount: Deactivated successfully. Dec 16 12:18:35.082879 systemd[1]: var-lib-docker-metacopy\x2dcheck1313554044-merged.mount: Deactivated successfully. Dec 16 12:18:35.090499 dockerd[1921]: time="2025-12-16T12:18:35.090434079Z" level=info msg="Loading containers: start." Dec 16 12:18:35.100662 kernel: Initializing XFRM netlink socket Dec 16 12:18:35.163000 audit[1968]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1968 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.163000 audit[1968]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffff026d630 a2=0 a3=0 items=0 ppid=1921 pid=1968 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.163000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:18:35.166000 audit[1970]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1970 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.166000 audit[1970]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffc9dbb1c0 a2=0 a3=0 items=0 ppid=1921 pid=1970 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.166000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:18:35.168000 audit[1972]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1972 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.168000 audit[1972]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe706f500 a2=0 a3=0 items=0 ppid=1921 pid=1972 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.168000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:18:35.171000 audit[1974]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1974 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.171000 audit[1974]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffee08d110 a2=0 a3=0 items=0 ppid=1921 pid=1974 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:18:35.171000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:18:35.174000 audit[1976]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=1976 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.174000 audit[1976]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffe98db040 a2=0 a3=0 items=0 ppid=1921 pid=1976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.174000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:18:35.176000 audit[1978]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=1978 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.176000 audit[1978]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffcce5f4d0 a2=0 a3=0 items=0 ppid=1921 pid=1978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.176000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:18:35.178000 audit[1980]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1980 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.178000 audit[1980]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffff3252300 a2=0 a3=0 items=0 ppid=1921 pid=1980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.178000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:18:35.181000 audit[1982]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=1982 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.181000 audit[1982]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffecf29670 a2=0 a3=0 items=0 ppid=1921 pid=1982 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.181000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:18:35.212000 audit[1985]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=1985 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.212000 audit[1985]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffd0fc48f0 a2=0 a3=0 items=0 ppid=1921 pid=1985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.212000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Dec 16 12:18:35.214000 audit[1987]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=1987 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.214000 audit[1987]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffeb9aae60 a2=0 a3=0 items=0 ppid=1921 pid=1987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.214000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:18:35.218000 audit[1989]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1989 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.218000 audit[1989]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffeba6d090 a2=0 a3=0 items=0 ppid=1921 pid=1989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.218000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:18:35.220000 audit[1991]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=1991 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.220000 audit[1991]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=fffff36bc730 a2=0 a3=0 items=0 ppid=1921 pid=1991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.220000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:18:35.222000 audit[1993]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=1993 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.222000 audit[1993]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffd2c0e3b0 a2=0 a3=0 items=0 ppid=1921 pid=1993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.222000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:18:35.266000 audit[2023]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2023 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.266000 audit[2023]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffe0692df0 a2=0 a3=0 items=0 ppid=1921 pid=2023 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.266000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:18:35.269000 audit[2025]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2025 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.269000 audit[2025]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffe88e8630 a2=0 a3=0 items=0 ppid=1921 pid=2025 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.269000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:18:35.271000 audit[2027]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2027 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.271000 audit[2027]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffddc79d60 a2=0 a3=0 items=0 ppid=1921 pid=2027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.271000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:18:35.273000 audit[2029]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2029 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.273000 audit[2029]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc68a3250 a2=0 a3=0 items=0 ppid=1921 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.273000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:18:35.275000 audit[2031]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2031 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.275000 audit[2031]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffcb5aa020 a2=0 a3=0 items=0 ppid=1921 pid=2031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.275000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:18:35.277000 audit[2033]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2033 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.277000 audit[2033]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffcfc41180 a2=0 a3=0 items=0 ppid=1921 pid=2033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.277000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:18:35.279000 audit[2035]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2035 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.279000 audit[2035]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffe066cbf0 a2=0 a3=0 items=0 ppid=1921 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.279000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:18:35.281000 audit[2037]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2037 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.281000 audit[2037]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffcef93030 a2=0 a3=0 items=0 ppid=1921 pid=2037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.281000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:18:35.284000 audit[2039]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2039 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.284000 audit[2039]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffc24af3f0 a2=0 a3=0 items=0 ppid=1921 pid=2039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.284000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Dec 16 12:18:35.286000 audit[2041]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2041 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.286000 audit[2041]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffd7110440 a2=0 a3=0 items=0 ppid=1921 pid=2041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.286000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:18:35.289000 audit[2043]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2043 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.289000 audit[2043]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=fffffe531900 a2=0 a3=0 items=0 ppid=1921 pid=2043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.289000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:18:35.291000 audit[2045]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2045 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.291000 audit[2045]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffdf61dee0 a2=0 a3=0 items=0 ppid=1921 pid=2045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.291000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:18:35.293000 audit[2047]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2047 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.293000 audit[2047]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffebacedc0 a2=0 a3=0 items=0 ppid=1921 pid=2047 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.293000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:18:35.298000 audit[2052]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2052 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.298000 audit[2052]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffe62f9d20 a2=0 a3=0 items=0 ppid=1921 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.298000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:18:35.301000 audit[2054]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2054 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.301000 audit[2054]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffec1a4040 a2=0 a3=0 items=0 ppid=1921 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.301000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:18:35.303000 audit[2056]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2056 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.303000 audit[2056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffdd216000 a2=0 a3=0 items=0 ppid=1921 pid=2056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.303000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:18:35.305000 audit[2058]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2058 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.305000 audit[2058]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffca95c350 a2=0 a3=0 items=0 ppid=1921 pid=2058 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.305000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:18:35.307000 audit[2060]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2060 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.307000 audit[2060]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffca74b7a0 a2=0 a3=0 items=0 ppid=1921 pid=2060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.307000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:18:35.309000 audit[2062]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2062 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:35.309000 audit[2062]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffffcca560 a2=0 a3=0 items=0 ppid=1921 pid=2062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.309000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:18:35.328000 audit[2066]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2066 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.328000 audit[2066]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffd6e656e0 a2=0 a3=0 items=0 ppid=1921 pid=2066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.328000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Dec 16 12:18:35.334000 audit[2068]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2068 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.334000 audit[2068]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=fffffd940960 a2=0 a3=0 items=0 ppid=1921 pid=2068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.334000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Dec 16 12:18:35.343000 audit[2076]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2076 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.343000 audit[2076]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=ffffd1a6a4e0 a2=0 a3=0 items=0 ppid=1921 pid=2076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.343000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Dec 16 12:18:35.360000 audit[2082]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2082 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.360000 audit[2082]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=fffff8cdcd30 a2=0 a3=0 items=0 ppid=1921 pid=2082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.360000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Dec 16 12:18:35.364000 audit[2084]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2084 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.364000 audit[2084]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffefca1b20 a2=0 a3=0 items=0 ppid=1921 pid=2084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.364000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Dec 16 12:18:35.368000 audit[2086]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2086 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.368000 audit[2086]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=fffff6aaf9a0 a2=0 a3=0 items=0 ppid=1921 pid=2086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.368000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Dec 16 12:18:35.371000 audit[2088]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2088 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.371000 audit[2088]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=fffffe859d50 a2=0 a3=0 items=0 ppid=1921 pid=2088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.371000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:18:35.374000 audit[2090]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2090 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:35.374000 audit[2090]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffd902a420 a2=0 a3=0 items=0 ppid=1921 pid=2090 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:35.374000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Dec 16 12:18:35.376466 systemd-networkd[1470]: docker0: Link UP Dec 16 12:18:35.382173 dockerd[1921]: time="2025-12-16T12:18:35.382120859Z" level=info msg="Loading containers: done." Dec 16 12:18:35.406687 dockerd[1921]: time="2025-12-16T12:18:35.406572907Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 12:18:35.406687 dockerd[1921]: time="2025-12-16T12:18:35.406695635Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 12:18:35.406994 dockerd[1921]: time="2025-12-16T12:18:35.406882112Z" level=info msg="Initializing buildkit" Dec 16 12:18:35.430860 dockerd[1921]: time="2025-12-16T12:18:35.430809981Z" level=info msg="Completed buildkit initialization" Dec 16 12:18:35.438371 dockerd[1921]: time="2025-12-16T12:18:35.438298084Z" level=info msg="Daemon has completed initialization" Dec 16 12:18:35.438639 dockerd[1921]: time="2025-12-16T12:18:35.438522794Z" level=info msg="API listen on /run/docker.sock" Dec 16 12:18:35.439000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:35.439767 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 16 12:18:36.062909 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck610769668-merged.mount: Deactivated successfully. Dec 16 12:18:36.244057 containerd[1576]: time="2025-12-16T12:18:36.243996708Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Dec 16 12:18:36.937461 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount446917140.mount: Deactivated successfully. 
Dec 16 12:18:37.653082 containerd[1576]: time="2025-12-16T12:18:37.653007636Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:37.654646 containerd[1576]: time="2025-12-16T12:18:37.654470264Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=22974850" Dec 16 12:18:37.656386 containerd[1576]: time="2025-12-16T12:18:37.656327825Z" level=info msg="ImageCreate event name:\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:37.659660 containerd[1576]: time="2025-12-16T12:18:37.659487406Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:37.660691 containerd[1576]: time="2025-12-16T12:18:37.660646093Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"24567639\" in 1.416601845s" Dec 16 12:18:37.660691 containerd[1576]: time="2025-12-16T12:18:37.660692425Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\"" Dec 16 12:18:37.661647 containerd[1576]: time="2025-12-16T12:18:37.661421589Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Dec 16 12:18:39.083856 containerd[1576]: time="2025-12-16T12:18:39.083799870Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:39.085006 containerd[1576]: time="2025-12-16T12:18:39.084694019Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=19127323" Dec 16 12:18:39.086056 containerd[1576]: time="2025-12-16T12:18:39.086013842Z" level=info msg="ImageCreate event name:\"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:39.089411 containerd[1576]: time="2025-12-16T12:18:39.089369748Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:39.090746 containerd[1576]: time="2025-12-16T12:18:39.090481954Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"20719958\" in 1.429027038s" Dec 16 12:18:39.090746 containerd[1576]: time="2025-12-16T12:18:39.090520688Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\"" Dec 16 12:18:39.091188 
containerd[1576]: time="2025-12-16T12:18:39.091161990Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Dec 16 12:18:40.259272 containerd[1576]: time="2025-12-16T12:18:40.259177321Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:40.260827 containerd[1576]: time="2025-12-16T12:18:40.260769271Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=14183580" Dec 16 12:18:40.261931 containerd[1576]: time="2025-12-16T12:18:40.261859117Z" level=info msg="ImageCreate event name:\"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:40.265990 containerd[1576]: time="2025-12-16T12:18:40.265880894Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:40.266819 containerd[1576]: time="2025-12-16T12:18:40.266541874Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"15776215\" in 1.174342593s" Dec 16 12:18:40.266819 containerd[1576]: time="2025-12-16T12:18:40.266580627Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\"" Dec 16 12:18:40.268122 containerd[1576]: time="2025-12-16T12:18:40.268078806Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Dec 16 12:18:41.470418 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2220054179.mount: Deactivated successfully. 
Dec 16 12:18:41.687278 containerd[1576]: time="2025-12-16T12:18:41.686047673Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:41.687278 containerd[1576]: time="2025-12-16T12:18:41.687187067Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=12960247" Dec 16 12:18:41.688088 containerd[1576]: time="2025-12-16T12:18:41.688045870Z" level=info msg="ImageCreate event name:\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:41.690022 containerd[1576]: time="2025-12-16T12:18:41.689974826Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:41.690768 containerd[1576]: time="2025-12-16T12:18:41.690736868Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"22804272\" in 1.42262437s" Dec 16 12:18:41.690768 containerd[1576]: time="2025-12-16T12:18:41.690768377Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\"" Dec 16 12:18:41.691803 containerd[1576]: time="2025-12-16T12:18:41.691777386Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Dec 16 12:18:42.274423 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3983808881.mount: Deactivated successfully. Dec 16 12:18:42.884252 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Dec 16 12:18:42.886858 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Dec 16 12:18:43.059861 containerd[1576]: time="2025-12-16T12:18:43.059805503Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:43.063999 containerd[1576]: time="2025-12-16T12:18:43.063928244Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=19575910" Dec 16 12:18:43.064256 containerd[1576]: time="2025-12-16T12:18:43.064134868Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:43.071804 containerd[1576]: time="2025-12-16T12:18:43.071726435Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:43.073437 containerd[1576]: time="2025-12-16T12:18:43.073339635Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.381415501s" Dec 16 12:18:43.073437 containerd[1576]: time="2025-12-16T12:18:43.073390323Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Dec 16 12:18:43.075097 containerd[1576]: time="2025-12-16T12:18:43.074859281Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Dec 16 12:18:43.099000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:43.100985 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:18:43.103241 kernel: kauditd_printk_skb: 132 callbacks suppressed Dec 16 12:18:43.103325 kernel: audit: type=1130 audit(1765887523.099:281): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:43.113043 (kubelet)[2262]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:18:43.165726 kubelet[2262]: E1216 12:18:43.163455 2262 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:18:43.167585 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:18:43.167984 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:18:43.167000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:18:43.169001 systemd[1]: kubelet.service: Consumed 175ms CPU time, 106.3M memory peak. 
Dec 16 12:18:43.172694 kernel: audit: type=1131 audit(1765887523.167:282): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:18:43.662422 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2804050568.mount: Deactivated successfully. Dec 16 12:18:43.670912 containerd[1576]: time="2025-12-16T12:18:43.670854147Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:43.672174 containerd[1576]: time="2025-12-16T12:18:43.671907423Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=0" Dec 16 12:18:43.673227 containerd[1576]: time="2025-12-16T12:18:43.673158821Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:43.676674 containerd[1576]: time="2025-12-16T12:18:43.675708834Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:43.676674 containerd[1576]: time="2025-12-16T12:18:43.676410286Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 601.511651ms" Dec 16 12:18:43.676674 containerd[1576]: time="2025-12-16T12:18:43.676443532Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Dec 16 12:18:43.677242 containerd[1576]: time="2025-12-16T12:18:43.677202617Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Dec 16 12:18:44.337567 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2765465114.mount: Deactivated successfully. 
Dec 16 12:18:47.453660 containerd[1576]: time="2025-12-16T12:18:47.452934474Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:47.455399 containerd[1576]: time="2025-12-16T12:18:47.455342722Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=96314798" Dec 16 12:18:47.457526 containerd[1576]: time="2025-12-16T12:18:47.457470132Z" level=info msg="ImageCreate event name:\"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:47.461771 containerd[1576]: time="2025-12-16T12:18:47.461720199Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:47.464656 containerd[1576]: time="2025-12-16T12:18:47.462903949Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"98207481\" in 3.78556299s" Dec 16 12:18:47.464656 containerd[1576]: time="2025-12-16T12:18:47.462952646Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\"" Dec 16 12:18:53.071136 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:18:53.070000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:53.071351 systemd[1]: kubelet.service: Consumed 175ms CPU time, 106.3M memory peak. Dec 16 12:18:53.070000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:53.076660 kernel: audit: type=1130 audit(1765887533.070:283): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:53.076814 kernel: audit: type=1131 audit(1765887533.070:284): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:53.076048 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:18:53.118540 systemd[1]: Reload requested from client PID 2355 ('systemctl') (unit session-8.scope)... Dec 16 12:18:53.118556 systemd[1]: Reloading... Dec 16 12:18:53.263840 zram_generator::config[2405]: No configuration found. Dec 16 12:18:53.469283 systemd[1]: Reloading finished in 350 ms. 
Dec 16 12:18:53.504000 audit: BPF prog-id=61 op=LOAD Dec 16 12:18:53.505706 kernel: audit: type=1334 audit(1765887533.504:285): prog-id=61 op=LOAD Dec 16 12:18:53.505000 audit: BPF prog-id=58 op=UNLOAD Dec 16 12:18:53.506742 kernel: audit: type=1334 audit(1765887533.505:286): prog-id=58 op=UNLOAD Dec 16 12:18:53.506844 kernel: audit: type=1334 audit(1765887533.505:287): prog-id=62 op=LOAD Dec 16 12:18:53.505000 audit: BPF prog-id=62 op=LOAD Dec 16 12:18:53.511710 kernel: audit: type=1334 audit(1765887533.505:288): prog-id=63 op=LOAD Dec 16 12:18:53.511824 kernel: audit: type=1334 audit(1765887533.505:289): prog-id=59 op=UNLOAD Dec 16 12:18:53.511852 kernel: audit: type=1334 audit(1765887533.505:290): prog-id=60 op=UNLOAD Dec 16 12:18:53.511868 kernel: audit: type=1334 audit(1765887533.508:291): prog-id=64 op=LOAD Dec 16 12:18:53.511884 kernel: audit: type=1334 audit(1765887533.508:292): prog-id=47 op=UNLOAD Dec 16 12:18:53.505000 audit: BPF prog-id=63 op=LOAD Dec 16 12:18:53.505000 audit: BPF prog-id=59 op=UNLOAD Dec 16 12:18:53.505000 audit: BPF prog-id=60 op=UNLOAD Dec 16 12:18:53.508000 audit: BPF prog-id=64 op=LOAD Dec 16 12:18:53.508000 audit: BPF prog-id=47 op=UNLOAD Dec 16 12:18:53.508000 audit: BPF prog-id=65 op=LOAD Dec 16 12:18:53.508000 audit: BPF prog-id=66 op=LOAD Dec 16 12:18:53.508000 audit: BPF prog-id=48 op=UNLOAD Dec 16 12:18:53.508000 audit: BPF prog-id=49 op=UNLOAD Dec 16 12:18:53.510000 audit: BPF prog-id=67 op=LOAD Dec 16 12:18:53.510000 audit: BPF prog-id=68 op=LOAD Dec 16 12:18:53.510000 audit: BPF prog-id=54 op=UNLOAD Dec 16 12:18:53.510000 audit: BPF prog-id=55 op=UNLOAD Dec 16 12:18:53.510000 audit: BPF prog-id=69 op=LOAD Dec 16 12:18:53.510000 audit: BPF prog-id=50 op=UNLOAD Dec 16 12:18:53.516000 audit: BPF prog-id=70 op=LOAD Dec 16 12:18:53.516000 audit: BPF prog-id=56 op=UNLOAD Dec 16 12:18:53.518000 audit: BPF prog-id=71 op=LOAD Dec 16 12:18:53.518000 audit: BPF prog-id=51 op=UNLOAD Dec 16 12:18:53.518000 audit: BPF prog-id=72 op=LOAD Dec 16 12:18:53.518000 audit: BPF prog-id=73 op=LOAD Dec 16 12:18:53.518000 audit: BPF prog-id=52 op=UNLOAD Dec 16 12:18:53.519000 audit: BPF prog-id=53 op=UNLOAD Dec 16 12:18:53.519000 audit: BPF prog-id=74 op=LOAD Dec 16 12:18:53.519000 audit: BPF prog-id=41 op=UNLOAD Dec 16 12:18:53.520000 audit: BPF prog-id=75 op=LOAD Dec 16 12:18:53.520000 audit: BPF prog-id=76 op=LOAD Dec 16 12:18:53.520000 audit: BPF prog-id=42 op=UNLOAD Dec 16 12:18:53.520000 audit: BPF prog-id=43 op=UNLOAD Dec 16 12:18:53.521000 audit: BPF prog-id=77 op=LOAD Dec 16 12:18:53.521000 audit: BPF prog-id=57 op=UNLOAD Dec 16 12:18:53.522000 audit: BPF prog-id=78 op=LOAD Dec 16 12:18:53.522000 audit: BPF prog-id=44 op=UNLOAD Dec 16 12:18:53.522000 audit: BPF prog-id=79 op=LOAD Dec 16 12:18:53.522000 audit: BPF prog-id=80 op=LOAD Dec 16 12:18:53.522000 audit: BPF prog-id=45 op=UNLOAD Dec 16 12:18:53.522000 audit: BPF prog-id=46 op=UNLOAD Dec 16 12:18:53.540357 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 16 12:18:53.540438 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 16 12:18:53.540000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:18:53.541165 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:18:53.541237 systemd[1]: kubelet.service: Consumed 114ms CPU time, 95.1M memory peak. 
Dec 16 12:18:53.543829 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:18:53.708755 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:18:53.708000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:53.716986 (kubelet)[2450]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:18:53.762268 kubelet[2450]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 12:18:53.762730 kubelet[2450]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:18:53.763878 kubelet[2450]: I1216 12:18:53.763808 2450 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:18:55.128850 kubelet[2450]: I1216 12:18:55.128804 2450 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 12:18:55.129256 kubelet[2450]: I1216 12:18:55.129240 2450 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:18:55.130750 kubelet[2450]: I1216 12:18:55.130721 2450 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 12:18:55.130890 kubelet[2450]: I1216 12:18:55.130848 2450 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 12:18:55.131271 kubelet[2450]: I1216 12:18:55.131251 2450 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 12:18:55.137701 kubelet[2450]: E1216 12:18:55.137658 2450 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://46.224.130.63:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 46.224.130.63:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 16 12:18:55.138408 kubelet[2450]: I1216 12:18:55.138380 2450 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:18:55.147211 kubelet[2450]: I1216 12:18:55.147151 2450 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:18:55.150012 kubelet[2450]: I1216 12:18:55.149958 2450 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 16 12:18:55.150437 kubelet[2450]: I1216 12:18:55.150404 2450 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:18:55.150714 kubelet[2450]: I1216 12:18:55.150514 2450 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-5-8fe0b910ae","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:18:55.150862 kubelet[2450]: I1216 12:18:55.150848 2450 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 12:18:55.150911 kubelet[2450]: I1216 12:18:55.150903 2450 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 12:18:55.151078 kubelet[2450]: I1216 12:18:55.151061 2450 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 12:18:55.153738 kubelet[2450]: I1216 12:18:55.153705 2450 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:18:55.155581 kubelet[2450]: I1216 12:18:55.155552 2450 kubelet.go:475] "Attempting to sync node with API server" Dec 16 12:18:55.155726 kubelet[2450]: I1216 12:18:55.155711 2450 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:18:55.155818 kubelet[2450]: I1216 12:18:55.155807 2450 kubelet.go:387] "Adding apiserver pod source" Dec 16 12:18:55.155876 kubelet[2450]: I1216 12:18:55.155866 2450 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:18:55.157258 kubelet[2450]: E1216 12:18:55.157222 2450 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://46.224.130.63:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-5-8fe0b910ae&limit=500&resourceVersion=0\": dial tcp 46.224.130.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 16 12:18:55.157762 kubelet[2450]: I1216 12:18:55.157741 2450 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:18:55.158522 kubelet[2450]: I1216 
12:18:55.158498 2450 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 12:18:55.158634 kubelet[2450]: I1216 12:18:55.158606 2450 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 12:18:55.158722 kubelet[2450]: W1216 12:18:55.158711 2450 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 16 12:18:55.162396 kubelet[2450]: I1216 12:18:55.162374 2450 server.go:1262] "Started kubelet" Dec 16 12:18:55.162778 kubelet[2450]: E1216 12:18:55.162754 2450 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://46.224.130.63:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 46.224.130.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 12:18:55.165490 kubelet[2450]: I1216 12:18:55.165454 2450 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:18:55.167263 kubelet[2450]: I1216 12:18:55.166345 2450 server.go:310] "Adding debug handlers to kubelet server" Dec 16 12:18:55.167263 kubelet[2450]: I1216 12:18:55.166493 2450 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:18:55.167263 kubelet[2450]: I1216 12:18:55.166566 2450 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 12:18:55.167263 kubelet[2450]: I1216 12:18:55.166893 2450 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:18:55.168322 kubelet[2450]: E1216 12:18:55.167032 2450 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://46.224.130.63:6443/api/v1/namespaces/default/events\": dial tcp 46.224.130.63:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-5-8fe0b910ae.1881b15b6ea3509e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-5-8fe0b910ae,UID:ci-4547-0-0-5-8fe0b910ae,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-5-8fe0b910ae,},FirstTimestamp:2025-12-16 12:18:55.162282142 +0000 UTC m=+1.440326301,LastTimestamp:2025-12-16 12:18:55.162282142 +0000 UTC m=+1.440326301,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-5-8fe0b910ae,}" Dec 16 12:18:55.170445 kubelet[2450]: I1216 12:18:55.170419 2450 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:18:55.170803 kubelet[2450]: I1216 12:18:55.170786 2450 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:18:55.175115 kubelet[2450]: E1216 12:18:55.175069 2450 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-5-8fe0b910ae\" not found" Dec 16 12:18:55.175206 kubelet[2450]: I1216 12:18:55.175128 2450 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 12:18:55.175372 kubelet[2450]: I1216 12:18:55.175350 2450 desired_state_of_world_populator.go:146] "Desired 
state populator starts to run" Dec 16 12:18:55.175438 kubelet[2450]: I1216 12:18:55.175422 2450 reconciler.go:29] "Reconciler: start to sync state" Dec 16 12:18:55.176211 kubelet[2450]: E1216 12:18:55.176169 2450 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://46.224.130.63:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 46.224.130.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 12:18:55.177389 kubelet[2450]: E1216 12:18:55.177355 2450 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:18:55.178466 kubelet[2450]: I1216 12:18:55.177943 2450 factory.go:223] Registration of the systemd container factory successfully Dec 16 12:18:55.178466 kubelet[2450]: I1216 12:18:55.178207 2450 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:18:55.178000 audit[2466]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2466 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:55.180363 kubelet[2450]: E1216 12:18:55.180000 2450 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.130.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-5-8fe0b910ae?timeout=10s\": dial tcp 46.224.130.63:6443: connect: connection refused" interval="200ms" Dec 16 12:18:55.180363 kubelet[2450]: I1216 12:18:55.180275 2450 factory.go:223] Registration of the containerd container factory successfully Dec 16 12:18:55.178000 audit[2466]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=fffff6972030 a2=0 a3=0 items=0 ppid=2450 pid=2466 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.178000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:18:55.180000 audit[2467]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2467 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:55.180000 audit[2467]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffeee463e0 a2=0 a3=0 items=0 ppid=2450 pid=2467 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.180000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:18:55.184000 audit[2469]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2469 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:55.184000 audit[2469]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffff9154020 a2=0 a3=0 items=0 ppid=2450 pid=2469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.184000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:18:55.194000 audit[2472]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2472 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:55.194000 audit[2472]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffe5e40800 a2=0 a3=0 items=0 ppid=2450 pid=2472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.194000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:18:55.197910 kubelet[2450]: I1216 12:18:55.197884 2450 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:18:55.198732 kubelet[2450]: I1216 12:18:55.198689 2450 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:18:55.199054 kubelet[2450]: I1216 12:18:55.198818 2450 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:18:55.201081 kubelet[2450]: I1216 12:18:55.200948 2450 policy_none.go:49] "None policy: Start" Dec 16 12:18:55.201694 kubelet[2450]: I1216 12:18:55.201660 2450 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 12:18:55.201694 kubelet[2450]: I1216 12:18:55.201692 2450 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 12:18:55.203997 kubelet[2450]: I1216 12:18:55.203084 2450 policy_none.go:47] "Start" Dec 16 12:18:55.204000 audit[2477]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2477 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:55.204000 audit[2477]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=fffffe14aba0 a2=0 a3=0 items=0 ppid=2450 pid=2477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.204000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F380000002D2D737263003132372E Dec 16 12:18:55.207118 kubelet[2450]: I1216 12:18:55.207067 2450 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv4" Dec 16 12:18:55.206000 audit[2479]: NETFILTER_CFG table=mangle:47 family=2 entries=1 op=nft_register_chain pid=2479 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:55.207000 audit[2478]: NETFILTER_CFG table=mangle:48 family=10 entries=2 op=nft_register_chain pid=2478 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:55.207000 audit[2478]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffca7c4900 a2=0 a3=0 items=0 ppid=2450 pid=2478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.207000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:18:55.206000 audit[2479]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe0788700 a2=0 a3=0 items=0 ppid=2450 pid=2479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.209684 kubelet[2450]: I1216 12:18:55.209638 2450 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Dec 16 12:18:55.209684 kubelet[2450]: I1216 12:18:55.209669 2450 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 12:18:55.209742 kubelet[2450]: I1216 12:18:55.209695 2450 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 12:18:55.206000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:18:55.210717 kubelet[2450]: E1216 12:18:55.210490 2450 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:18:55.209000 audit[2481]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2481 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:55.209000 audit[2481]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffca9a7fa0 a2=0 a3=0 items=0 ppid=2450 pid=2481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.209000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:18:55.211494 kubelet[2450]: E1216 12:18:55.211460 2450 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://46.224.130.63:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 46.224.130.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 12:18:55.211000 audit[2483]: NETFILTER_CFG table=nat:50 family=10 entries=1 op=nft_register_chain pid=2483 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:55.214041 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Dec 16 12:18:55.211000 audit[2483]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff6c4d1a0 a2=0 a3=0 items=0 ppid=2450 pid=2483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.211000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:18:55.213000 audit[2482]: NETFILTER_CFG table=nat:51 family=2 entries=1 op=nft_register_chain pid=2482 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:55.213000 audit[2482]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe51c4800 a2=0 a3=0 items=0 ppid=2450 pid=2482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.213000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:18:55.217000 audit[2484]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=2484 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:18:55.217000 audit[2484]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffff291610 a2=0 a3=0 items=0 ppid=2450 pid=2484 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.217000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:18:55.218000 audit[2485]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2485 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:18:55.218000 audit[2485]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc1ef27f0 a2=0 a3=0 items=0 ppid=2450 pid=2485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:55.218000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:18:55.226263 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 16 12:18:55.232779 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 16 12:18:55.249727 kubelet[2450]: E1216 12:18:55.249670 2450 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 12:18:55.250993 kubelet[2450]: I1216 12:18:55.250959 2450 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:18:55.251247 kubelet[2450]: I1216 12:18:55.251191 2450 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:18:55.253965 kubelet[2450]: I1216 12:18:55.253371 2450 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:18:55.254428 kubelet[2450]: E1216 12:18:55.254404 2450 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 16 12:18:55.254695 kubelet[2450]: E1216 12:18:55.254677 2450 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-0-0-5-8fe0b910ae\" not found" Dec 16 12:18:55.329049 systemd[1]: Created slice kubepods-burstable-podb96525c743ae3bc6e9d95d0d57c3e53f.slice - libcontainer container kubepods-burstable-podb96525c743ae3bc6e9d95d0d57c3e53f.slice. Dec 16 12:18:55.341886 kubelet[2450]: E1216 12:18:55.341502 2450 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-8fe0b910ae\" not found" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.344774 systemd[1]: Created slice kubepods-burstable-pod3c72325168c2828afd25cc084a94b656.slice - libcontainer container kubepods-burstable-pod3c72325168c2828afd25cc084a94b656.slice. Dec 16 12:18:55.352882 kubelet[2450]: E1216 12:18:55.351716 2450 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-8fe0b910ae\" not found" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.354776 kubelet[2450]: I1216 12:18:55.354747 2450 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.356998 systemd[1]: Created slice kubepods-burstable-podb496148d32baf531e2e18c8434d9fcaf.slice - libcontainer container kubepods-burstable-podb496148d32baf531e2e18c8434d9fcaf.slice. Dec 16 12:18:55.357926 kubelet[2450]: E1216 12:18:55.357893 2450 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.130.63:6443/api/v1/nodes\": dial tcp 46.224.130.63:6443: connect: connection refused" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.360669 kubelet[2450]: E1216 12:18:55.360616 2450 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-8fe0b910ae\" not found" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.376186 kubelet[2450]: I1216 12:18:55.375987 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b96525c743ae3bc6e9d95d0d57c3e53f-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-5-8fe0b910ae\" (UID: \"b96525c743ae3bc6e9d95d0d57c3e53f\") " pod="kube-system/kube-apiserver-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.376186 kubelet[2450]: I1216 12:18:55.376082 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b96525c743ae3bc6e9d95d0d57c3e53f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-5-8fe0b910ae\" (UID: \"b96525c743ae3bc6e9d95d0d57c3e53f\") " pod="kube-system/kube-apiserver-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.376385 kubelet[2450]: I1216 12:18:55.376243 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3c72325168c2828afd25cc084a94b656-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-5-8fe0b910ae\" (UID: \"3c72325168c2828afd25cc084a94b656\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.376481 kubelet[2450]: I1216 12:18:55.376384 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/3c72325168c2828afd25cc084a94b656-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-5-8fe0b910ae\" (UID: \"3c72325168c2828afd25cc084a94b656\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.376536 kubelet[2450]: I1216 12:18:55.376474 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3c72325168c2828afd25cc084a94b656-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-5-8fe0b910ae\" (UID: \"3c72325168c2828afd25cc084a94b656\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.376656 kubelet[2450]: I1216 12:18:55.376572 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b96525c743ae3bc6e9d95d0d57c3e53f-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-5-8fe0b910ae\" (UID: \"b96525c743ae3bc6e9d95d0d57c3e53f\") " pod="kube-system/kube-apiserver-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.376772 kubelet[2450]: I1216 12:18:55.376667 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3c72325168c2828afd25cc084a94b656-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-5-8fe0b910ae\" (UID: \"3c72325168c2828afd25cc084a94b656\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.377034 kubelet[2450]: I1216 12:18:55.376902 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3c72325168c2828afd25cc084a94b656-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-5-8fe0b910ae\" (UID: \"3c72325168c2828afd25cc084a94b656\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.377034 kubelet[2450]: I1216 12:18:55.376956 2450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b496148d32baf531e2e18c8434d9fcaf-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-5-8fe0b910ae\" (UID: \"b496148d32baf531e2e18c8434d9fcaf\") " pod="kube-system/kube-scheduler-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.381094 kubelet[2450]: E1216 12:18:55.380947 2450 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.130.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-5-8fe0b910ae?timeout=10s\": dial tcp 46.224.130.63:6443: connect: connection refused" interval="400ms" Dec 16 12:18:55.561241 kubelet[2450]: I1216 12:18:55.561193 2450 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.561908 kubelet[2450]: E1216 12:18:55.561858 2450 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.130.63:6443/api/v1/nodes\": dial tcp 46.224.130.63:6443: connect: connection refused" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.646386 containerd[1576]: time="2025-12-16T12:18:55.646316319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-5-8fe0b910ae,Uid:b96525c743ae3bc6e9d95d0d57c3e53f,Namespace:kube-system,Attempt:0,}" Dec 16 12:18:55.655938 containerd[1576]: time="2025-12-16T12:18:55.655877290Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-5-8fe0b910ae,Uid:3c72325168c2828afd25cc084a94b656,Namespace:kube-system,Attempt:0,}" Dec 16 12:18:55.663379 containerd[1576]: time="2025-12-16T12:18:55.663329589Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-5-8fe0b910ae,Uid:b496148d32baf531e2e18c8434d9fcaf,Namespace:kube-system,Attempt:0,}" Dec 16 12:18:55.781864 kubelet[2450]: E1216 12:18:55.781823 2450 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.130.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-5-8fe0b910ae?timeout=10s\": dial tcp 46.224.130.63:6443: connect: connection refused" interval="800ms" Dec 16 12:18:55.965739 kubelet[2450]: I1216 12:18:55.965363 2450 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:55.966201 kubelet[2450]: E1216 12:18:55.966156 2450 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.130.63:6443/api/v1/nodes\": dial tcp 46.224.130.63:6443: connect: connection refused" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:56.168547 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1129362820.mount: Deactivated successfully. Dec 16 12:18:56.175110 containerd[1576]: time="2025-12-16T12:18:56.175031253Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:18:56.178253 containerd[1576]: time="2025-12-16T12:18:56.178162081Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:18:56.181317 containerd[1576]: time="2025-12-16T12:18:56.181244421Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:18:56.182806 containerd[1576]: time="2025-12-16T12:18:56.182748804Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:18:56.183602 containerd[1576]: time="2025-12-16T12:18:56.183546904Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:18:56.184504 containerd[1576]: time="2025-12-16T12:18:56.184380650Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:18:56.187655 containerd[1576]: time="2025-12-16T12:18:56.185696120Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:18:56.187655 containerd[1576]: time="2025-12-16T12:18:56.186735262Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:18:56.189666 containerd[1576]: time="2025-12-16T12:18:56.189084673Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo 
digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 523.626369ms" Dec 16 12:18:56.190549 containerd[1576]: time="2025-12-16T12:18:56.190519564Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 540.381897ms" Dec 16 12:18:56.192166 containerd[1576]: time="2025-12-16T12:18:56.192123005Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 532.256696ms" Dec 16 12:18:56.220066 containerd[1576]: time="2025-12-16T12:18:56.219906187Z" level=info msg="connecting to shim d5b9a1c82489eb093dd4a3b14a9ae370c8078e144d631eefe3da5d9536ec4adc" address="unix:///run/containerd/s/d55f5c5067a148845440327ade58cfe1d2d343e3aaa22098c6a6de51ba5f04d6" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:18:56.232032 containerd[1576]: time="2025-12-16T12:18:56.231990422Z" level=info msg="connecting to shim 1c8f24b123b4362c50619b143e8fc2028b85acf2b9ee46b9c1001b839d4a0e34" address="unix:///run/containerd/s/84e54509565ef00bba7bcc13c030004b5a514bdfb098b90fcb33db410f3ad8ca" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:18:56.243059 containerd[1576]: time="2025-12-16T12:18:56.243010871Z" level=info msg="connecting to shim 2dc9032b1843f2e1e8089b687e7f696c4861dee1a05bfce81569647f89ec0454" address="unix:///run/containerd/s/6b1046f13ac1c35f6ecbe65361497e405f3160869526199bba89b8853e52f1a7" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:18:56.244703 kubelet[2450]: E1216 12:18:56.244669 2450 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://46.224.130.63:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-5-8fe0b910ae&limit=500&resourceVersion=0\": dial tcp 46.224.130.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 16 12:18:56.269874 systemd[1]: Started cri-containerd-d5b9a1c82489eb093dd4a3b14a9ae370c8078e144d631eefe3da5d9536ec4adc.scope - libcontainer container d5b9a1c82489eb093dd4a3b14a9ae370c8078e144d631eefe3da5d9536ec4adc. Dec 16 12:18:56.278682 systemd[1]: Started cri-containerd-1c8f24b123b4362c50619b143e8fc2028b85acf2b9ee46b9c1001b839d4a0e34.scope - libcontainer container 1c8f24b123b4362c50619b143e8fc2028b85acf2b9ee46b9c1001b839d4a0e34. Dec 16 12:18:56.289040 kubelet[2450]: E1216 12:18:56.288562 2450 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://46.224.130.63:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 46.224.130.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 12:18:56.294883 systemd[1]: Started cri-containerd-2dc9032b1843f2e1e8089b687e7f696c4861dee1a05bfce81569647f89ec0454.scope - libcontainer container 2dc9032b1843f2e1e8089b687e7f696c4861dee1a05bfce81569647f89ec0454. 
Dec 16 12:18:56.296000 audit: BPF prog-id=81 op=LOAD Dec 16 12:18:56.297000 audit: BPF prog-id=82 op=LOAD Dec 16 12:18:56.297000 audit[2529]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c180 a2=98 a3=0 items=0 ppid=2499 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435623961316338323438396562303933646434613362313461396165 Dec 16 12:18:56.297000 audit: BPF prog-id=82 op=UNLOAD Dec 16 12:18:56.297000 audit[2529]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2499 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435623961316338323438396562303933646434613362313461396165 Dec 16 12:18:56.297000 audit: BPF prog-id=83 op=LOAD Dec 16 12:18:56.297000 audit[2529]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c3e8 a2=98 a3=0 items=0 ppid=2499 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435623961316338323438396562303933646434613362313461396165 Dec 16 12:18:56.297000 audit: BPF prog-id=84 op=LOAD Dec 16 12:18:56.297000 audit[2529]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=400010c168 a2=98 a3=0 items=0 ppid=2499 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435623961316338323438396562303933646434613362313461396165 Dec 16 12:18:56.297000 audit: BPF prog-id=84 op=UNLOAD Dec 16 12:18:56.297000 audit[2529]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2499 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435623961316338323438396562303933646434613362313461396165 Dec 16 12:18:56.297000 audit: BPF prog-id=83 op=UNLOAD Dec 16 12:18:56.297000 audit[2529]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2499 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435623961316338323438396562303933646434613362313461396165 Dec 16 12:18:56.297000 audit: BPF prog-id=85 op=LOAD Dec 16 12:18:56.297000 audit[2529]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c648 a2=98 a3=0 items=0 ppid=2499 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.297000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435623961316338323438396562303933646434613362313461396165 Dec 16 12:18:56.306000 audit: BPF prog-id=86 op=LOAD Dec 16 12:18:56.308000 audit: BPF prog-id=87 op=LOAD Dec 16 12:18:56.308000 audit[2549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2510 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163386632346231323362343336326335303631396231343365386663 Dec 16 12:18:56.308000 audit: BPF prog-id=87 op=UNLOAD Dec 16 12:18:56.308000 audit[2549]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2510 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163386632346231323362343336326335303631396231343365386663 Dec 16 12:18:56.308000 audit: BPF prog-id=88 op=LOAD Dec 16 12:18:56.308000 audit[2549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2510 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163386632346231323362343336326335303631396231343365386663 Dec 16 12:18:56.308000 audit: BPF prog-id=89 op=LOAD Dec 16 12:18:56.308000 audit[2549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2510 pid=2549 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163386632346231323362343336326335303631396231343365386663 Dec 16 12:18:56.308000 audit: BPF prog-id=89 op=UNLOAD Dec 16 12:18:56.308000 audit[2549]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2510 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163386632346231323362343336326335303631396231343365386663 Dec 16 12:18:56.308000 audit: BPF prog-id=88 op=UNLOAD Dec 16 12:18:56.308000 audit[2549]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2510 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163386632346231323362343336326335303631396231343365386663 Dec 16 12:18:56.308000 audit: BPF prog-id=90 op=LOAD Dec 16 12:18:56.308000 audit[2549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2510 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163386632346231323362343336326335303631396231343365386663 Dec 16 12:18:56.313000 audit: BPF prog-id=91 op=LOAD Dec 16 12:18:56.315000 audit: BPF prog-id=92 op=LOAD Dec 16 12:18:56.315000 audit[2561]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2534 pid=2561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.315000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264633930333262313834336632653165383038396236383765376636 Dec 16 12:18:56.315000 audit: BPF prog-id=92 op=UNLOAD Dec 16 12:18:56.315000 audit[2561]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2534 pid=2561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.315000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264633930333262313834336632653165383038396236383765376636 Dec 16 12:18:56.316000 audit: BPF prog-id=93 op=LOAD Dec 16 12:18:56.316000 audit[2561]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2534 pid=2561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.316000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264633930333262313834336632653165383038396236383765376636 Dec 16 12:18:56.316000 audit: BPF prog-id=94 op=LOAD Dec 16 12:18:56.316000 audit[2561]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2534 pid=2561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.316000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264633930333262313834336632653165383038396236383765376636 Dec 16 12:18:56.316000 audit: BPF prog-id=94 op=UNLOAD Dec 16 12:18:56.316000 audit[2561]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2534 pid=2561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.316000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264633930333262313834336632653165383038396236383765376636 Dec 16 12:18:56.316000 audit: BPF prog-id=93 op=UNLOAD Dec 16 12:18:56.316000 audit[2561]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2534 pid=2561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.316000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264633930333262313834336632653165383038396236383765376636 Dec 16 12:18:56.316000 audit: BPF prog-id=95 op=LOAD Dec 16 12:18:56.316000 audit[2561]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2534 pid=2561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.316000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3264633930333262313834336632653165383038396236383765376636 Dec 16 12:18:56.352007 containerd[1576]: time="2025-12-16T12:18:56.351963540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-5-8fe0b910ae,Uid:b96525c743ae3bc6e9d95d0d57c3e53f,Namespace:kube-system,Attempt:0,} returns sandbox id \"d5b9a1c82489eb093dd4a3b14a9ae370c8078e144d631eefe3da5d9536ec4adc\"" Dec 16 12:18:56.362036 containerd[1576]: time="2025-12-16T12:18:56.361939806Z" level=info msg="CreateContainer within sandbox \"d5b9a1c82489eb093dd4a3b14a9ae370c8078e144d631eefe3da5d9536ec4adc\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 16 12:18:56.365522 containerd[1576]: time="2025-12-16T12:18:56.365480865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-5-8fe0b910ae,Uid:b496148d32baf531e2e18c8434d9fcaf,Namespace:kube-system,Attempt:0,} returns sandbox id \"1c8f24b123b4362c50619b143e8fc2028b85acf2b9ee46b9c1001b839d4a0e34\"" Dec 16 12:18:56.372404 containerd[1576]: time="2025-12-16T12:18:56.372360550Z" level=info msg="CreateContainer within sandbox \"1c8f24b123b4362c50619b143e8fc2028b85acf2b9ee46b9c1001b839d4a0e34\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 16 12:18:56.375775 containerd[1576]: time="2025-12-16T12:18:56.375710376Z" level=info msg="Container 060b2099e39d0737c31d2deda2a79ff67edc22c662229e1866b78b7e46848e2a: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:18:56.386265 containerd[1576]: time="2025-12-16T12:18:56.385594666Z" level=info msg="CreateContainer within sandbox \"d5b9a1c82489eb093dd4a3b14a9ae370c8078e144d631eefe3da5d9536ec4adc\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"060b2099e39d0737c31d2deda2a79ff67edc22c662229e1866b78b7e46848e2a\"" Dec 16 12:18:56.387479 containerd[1576]: time="2025-12-16T12:18:56.387426666Z" level=info msg="StartContainer for \"060b2099e39d0737c31d2deda2a79ff67edc22c662229e1866b78b7e46848e2a\"" Dec 16 12:18:56.388205 containerd[1576]: time="2025-12-16T12:18:56.388148313Z" level=info msg="Container 73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:18:56.389338 containerd[1576]: time="2025-12-16T12:18:56.389287992Z" level=info msg="connecting to shim 060b2099e39d0737c31d2deda2a79ff67edc22c662229e1866b78b7e46848e2a" address="unix:///run/containerd/s/d55f5c5067a148845440327ade58cfe1d2d343e3aaa22098c6a6de51ba5f04d6" protocol=ttrpc version=3 Dec 16 12:18:56.390582 containerd[1576]: time="2025-12-16T12:18:56.390529569Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-5-8fe0b910ae,Uid:3c72325168c2828afd25cc084a94b656,Namespace:kube-system,Attempt:0,} returns sandbox id \"2dc9032b1843f2e1e8089b687e7f696c4861dee1a05bfce81569647f89ec0454\"" Dec 16 12:18:56.399327 containerd[1576]: time="2025-12-16T12:18:56.399260658Z" level=info msg="CreateContainer within sandbox \"2dc9032b1843f2e1e8089b687e7f696c4861dee1a05bfce81569647f89ec0454\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 16 12:18:56.402218 containerd[1576]: time="2025-12-16T12:18:56.402171487Z" level=info msg="CreateContainer within sandbox \"1c8f24b123b4362c50619b143e8fc2028b85acf2b9ee46b9c1001b839d4a0e34\" for 
&ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10\"" Dec 16 12:18:56.403478 containerd[1576]: time="2025-12-16T12:18:56.403450951Z" level=info msg="StartContainer for \"73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10\"" Dec 16 12:18:56.410480 containerd[1576]: time="2025-12-16T12:18:56.409161750Z" level=info msg="connecting to shim 73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10" address="unix:///run/containerd/s/84e54509565ef00bba7bcc13c030004b5a514bdfb098b90fcb33db410f3ad8ca" protocol=ttrpc version=3 Dec 16 12:18:56.414321 containerd[1576]: time="2025-12-16T12:18:56.414261843Z" level=info msg="Container a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:18:56.415535 systemd[1]: Started cri-containerd-060b2099e39d0737c31d2deda2a79ff67edc22c662229e1866b78b7e46848e2a.scope - libcontainer container 060b2099e39d0737c31d2deda2a79ff67edc22c662229e1866b78b7e46848e2a. Dec 16 12:18:56.433560 containerd[1576]: time="2025-12-16T12:18:56.433511972Z" level=info msg="CreateContainer within sandbox \"2dc9032b1843f2e1e8089b687e7f696c4861dee1a05bfce81569647f89ec0454\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e\"" Dec 16 12:18:56.435234 containerd[1576]: time="2025-12-16T12:18:56.435126535Z" level=info msg="StartContainer for \"a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e\"" Dec 16 12:18:56.436581 containerd[1576]: time="2025-12-16T12:18:56.436337307Z" level=info msg="connecting to shim a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e" address="unix:///run/containerd/s/6b1046f13ac1c35f6ecbe65361497e405f3160869526199bba89b8853e52f1a7" protocol=ttrpc version=3 Dec 16 12:18:56.443378 systemd[1]: Started cri-containerd-73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10.scope - libcontainer container 73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10. 
Dec 16 12:18:56.449000 audit: BPF prog-id=96 op=LOAD Dec 16 12:18:56.451000 audit: BPF prog-id=97 op=LOAD Dec 16 12:18:56.451000 audit[2626]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2499 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.451000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036306232303939653339643037333763333164326465646132613739 Dec 16 12:18:56.452000 audit: BPF prog-id=97 op=UNLOAD Dec 16 12:18:56.452000 audit[2626]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2499 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.452000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036306232303939653339643037333763333164326465646132613739 Dec 16 12:18:56.453000 audit: BPF prog-id=98 op=LOAD Dec 16 12:18:56.453000 audit[2626]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2499 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.453000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036306232303939653339643037333763333164326465646132613739 Dec 16 12:18:56.453000 audit: BPF prog-id=99 op=LOAD Dec 16 12:18:56.453000 audit[2626]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2499 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.453000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036306232303939653339643037333763333164326465646132613739 Dec 16 12:18:56.453000 audit: BPF prog-id=99 op=UNLOAD Dec 16 12:18:56.453000 audit[2626]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2499 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.453000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036306232303939653339643037333763333164326465646132613739 Dec 16 12:18:56.453000 audit: BPF prog-id=98 op=UNLOAD Dec 16 12:18:56.453000 audit[2626]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2499 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.453000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036306232303939653339643037333763333164326465646132613739 Dec 16 12:18:56.453000 audit: BPF prog-id=100 op=LOAD Dec 16 12:18:56.453000 audit[2626]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2499 pid=2626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.453000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036306232303939653339643037333763333164326465646132613739 Dec 16 12:18:56.462240 systemd[1]: Started cri-containerd-a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e.scope - libcontainer container a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e. Dec 16 12:18:56.466069 kubelet[2450]: E1216 12:18:56.466032 2450 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://46.224.130.63:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 46.224.130.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 12:18:56.476000 audit: BPF prog-id=101 op=LOAD Dec 16 12:18:56.477000 audit: BPF prog-id=102 op=LOAD Dec 16 12:18:56.477000 audit[2638]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=2510 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.477000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733663663356564323832363031383033396233353731353035346161 Dec 16 12:18:56.482000 audit: BPF prog-id=102 op=UNLOAD Dec 16 12:18:56.482000 audit[2638]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2510 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.482000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733663663356564323832363031383033396233353731353035346161 Dec 16 12:18:56.482000 audit: BPF prog-id=103 op=LOAD Dec 16 12:18:56.482000 audit[2638]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=2510 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.482000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733663663356564323832363031383033396233353731353035346161 Dec 16 12:18:56.483000 audit: BPF prog-id=104 op=LOAD Dec 16 12:18:56.483000 audit[2638]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=2510 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733663663356564323832363031383033396233353731353035346161 Dec 16 12:18:56.483000 audit: BPF prog-id=104 op=UNLOAD Dec 16 12:18:56.483000 audit[2638]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2510 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733663663356564323832363031383033396233353731353035346161 Dec 16 12:18:56.486000 audit: BPF prog-id=103 op=UNLOAD Dec 16 12:18:56.486000 audit[2638]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2510 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.486000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733663663356564323832363031383033396233353731353035346161 Dec 16 12:18:56.486000 audit: BPF prog-id=105 op=LOAD Dec 16 12:18:56.486000 audit[2638]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=2510 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.486000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733663663356564323832363031383033396233353731353035346161 Dec 16 12:18:56.491000 audit: BPF prog-id=106 op=LOAD Dec 16 12:18:56.495000 audit: BPF prog-id=107 op=LOAD Dec 16 12:18:56.495000 audit[2659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2534 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131646265346532363738386135313531333538653763313632613133 Dec 16 12:18:56.495000 audit: BPF prog-id=107 op=UNLOAD Dec 16 12:18:56.495000 audit[2659]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2534 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131646265346532363738386135313531333538653763313632613133 Dec 16 12:18:56.496000 audit: BPF prog-id=108 op=LOAD Dec 16 12:18:56.496000 audit[2659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2534 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131646265346532363738386135313531333538653763313632613133 Dec 16 12:18:56.496000 audit: BPF prog-id=109 op=LOAD Dec 16 12:18:56.496000 audit[2659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2534 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131646265346532363738386135313531333538653763313632613133 Dec 16 12:18:56.496000 audit: BPF prog-id=109 op=UNLOAD Dec 16 12:18:56.496000 audit[2659]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2534 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131646265346532363738386135313531333538653763313632613133 Dec 16 12:18:56.496000 audit: BPF prog-id=108 op=UNLOAD Dec 16 12:18:56.496000 audit[2659]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2534 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.496000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131646265346532363738386135313531333538653763313632613133 Dec 16 12:18:56.496000 audit: BPF prog-id=110 op=LOAD Dec 16 12:18:56.496000 audit[2659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2534 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:56.496000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131646265346532363738386135313531333538653763313632613133 Dec 16 12:18:56.538442 containerd[1576]: time="2025-12-16T12:18:56.538354482Z" level=info msg="StartContainer for \"060b2099e39d0737c31d2deda2a79ff67edc22c662229e1866b78b7e46848e2a\" returns successfully" Dec 16 12:18:56.545590 containerd[1576]: time="2025-12-16T12:18:56.545486530Z" level=info msg="StartContainer for \"73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10\" returns successfully" Dec 16 12:18:56.566007 containerd[1576]: time="2025-12-16T12:18:56.565946951Z" level=info msg="StartContainer for \"a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e\" returns successfully" Dec 16 12:18:56.584135 kubelet[2450]: E1216 12:18:56.584085 2450 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.130.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-5-8fe0b910ae?timeout=10s\": dial tcp 46.224.130.63:6443: connect: connection refused" interval="1.6s" Dec 16 12:18:56.638910 kubelet[2450]: E1216 12:18:56.638864 2450 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://46.224.130.63:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 46.224.130.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 12:18:56.768489 kubelet[2450]: I1216 12:18:56.768356 2450 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:57.229964 kubelet[2450]: E1216 12:18:57.229924 2450 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-8fe0b910ae\" not found" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:57.241419 kubelet[2450]: E1216 12:18:57.241228 2450 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-8fe0b910ae\" not found" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:57.245737 kubelet[2450]: E1216 12:18:57.245706 2450 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-8fe0b910ae\" not found" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:58.248195 kubelet[2450]: E1216 12:18:58.248152 2450 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-8fe0b910ae\" not found" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:58.249932 kubelet[2450]: E1216 12:18:58.249897 2450 kubelet.go:3215] "No need to create a 
mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-8fe0b910ae\" not found" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:59.366126 kubelet[2450]: I1216 12:18:59.365417 2450 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:59.380666 kubelet[2450]: I1216 12:18:59.380601 2450 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:59.426821 kubelet[2450]: E1216 12:18:59.426743 2450 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-5-8fe0b910ae\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:59.426821 kubelet[2450]: I1216 12:18:59.426797 2450 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:59.429370 kubelet[2450]: E1216 12:18:59.429323 2450 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-5-8fe0b910ae\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:59.429370 kubelet[2450]: I1216 12:18:59.429360 2450 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:18:59.431613 kubelet[2450]: E1216 12:18:59.431575 2450 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-5-8fe0b910ae\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:00.159834 kubelet[2450]: I1216 12:19:00.159779 2450 apiserver.go:52] "Watching apiserver" Dec 16 12:19:00.175556 kubelet[2450]: I1216 12:19:00.175507 2450 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 12:19:01.674791 systemd[1]: Reload requested from client PID 2729 ('systemctl') (unit session-8.scope)... Dec 16 12:19:01.674809 systemd[1]: Reloading... Dec 16 12:19:01.775674 zram_generator::config[2776]: No configuration found. Dec 16 12:19:02.013761 systemd[1]: Reloading finished in 338 ms. Dec 16 12:19:02.037283 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:19:02.053038 systemd[1]: kubelet.service: Deactivated successfully. Dec 16 12:19:02.053407 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:19:02.056396 kernel: kauditd_printk_skb: 202 callbacks suppressed Dec 16 12:19:02.056455 kernel: audit: type=1131 audit(1765887542.051:387): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:19:02.051000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:19:02.053488 systemd[1]: kubelet.service: Consumed 1.894s CPU time, 121.6M memory peak. Dec 16 12:19:02.058301 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
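[Annotation, not part of the captured journal] The kernel audit lines interleaved here carry their own clock: the value inside audit(1765887542.051:387) is seconds.milliseconds since the Unix epoch plus an event serial, and it resolves to the same Dec 16 12:19:02.051 instant as the syslog-style prefix. A small sketch of that conversion:

from datetime import datetime, timezone

def audit_stamp(raw: str) -> str:
    """'1765887542.051:387' -> human-readable UTC time plus the event serial."""
    stamp, serial = raw.split(":")
    secs, millis = stamp.split(".")
    ts = datetime.fromtimestamp(int(secs), tz=timezone.utc)
    return f"{ts:%Y-%m-%d %H:%M:%S}.{millis} UTC (event {serial})"

print(audit_stamp("1765887542.051:387"))  # 2025-12-16 12:19:02.051 UTC (event 387)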
Dec 16 12:19:02.060851 kernel: audit: type=1334 audit(1765887542.057:388): prog-id=111 op=LOAD Dec 16 12:19:02.060933 kernel: audit: type=1334 audit(1765887542.057:389): prog-id=77 op=UNLOAD Dec 16 12:19:02.057000 audit: BPF prog-id=111 op=LOAD Dec 16 12:19:02.057000 audit: BPF prog-id=77 op=UNLOAD Dec 16 12:19:02.061000 audit: BPF prog-id=112 op=LOAD Dec 16 12:19:02.061000 audit: BPF prog-id=113 op=LOAD Dec 16 12:19:02.061000 audit: BPF prog-id=67 op=UNLOAD Dec 16 12:19:02.061000 audit: BPF prog-id=68 op=UNLOAD Dec 16 12:19:02.065651 kernel: audit: type=1334 audit(1765887542.061:390): prog-id=112 op=LOAD Dec 16 12:19:02.065708 kernel: audit: type=1334 audit(1765887542.061:391): prog-id=113 op=LOAD Dec 16 12:19:02.065736 kernel: audit: type=1334 audit(1765887542.061:392): prog-id=67 op=UNLOAD Dec 16 12:19:02.065761 kernel: audit: type=1334 audit(1765887542.061:393): prog-id=68 op=UNLOAD Dec 16 12:19:02.065000 audit: BPF prog-id=114 op=LOAD Dec 16 12:19:02.068724 kernel: audit: type=1334 audit(1765887542.065:394): prog-id=114 op=LOAD Dec 16 12:19:02.068839 kernel: audit: type=1334 audit(1765887542.065:395): prog-id=61 op=UNLOAD Dec 16 12:19:02.068871 kernel: audit: type=1334 audit(1765887542.067:396): prog-id=115 op=LOAD Dec 16 12:19:02.065000 audit: BPF prog-id=61 op=UNLOAD Dec 16 12:19:02.067000 audit: BPF prog-id=115 op=LOAD Dec 16 12:19:02.067000 audit: BPF prog-id=116 op=LOAD Dec 16 12:19:02.067000 audit: BPF prog-id=62 op=UNLOAD Dec 16 12:19:02.067000 audit: BPF prog-id=63 op=UNLOAD Dec 16 12:19:02.067000 audit: BPF prog-id=117 op=LOAD Dec 16 12:19:02.072000 audit: BPF prog-id=64 op=UNLOAD Dec 16 12:19:02.072000 audit: BPF prog-id=118 op=LOAD Dec 16 12:19:02.072000 audit: BPF prog-id=119 op=LOAD Dec 16 12:19:02.072000 audit: BPF prog-id=65 op=UNLOAD Dec 16 12:19:02.072000 audit: BPF prog-id=66 op=UNLOAD Dec 16 12:19:02.074000 audit: BPF prog-id=120 op=LOAD Dec 16 12:19:02.074000 audit: BPF prog-id=70 op=UNLOAD Dec 16 12:19:02.075000 audit: BPF prog-id=121 op=LOAD Dec 16 12:19:02.075000 audit: BPF prog-id=71 op=UNLOAD Dec 16 12:19:02.075000 audit: BPF prog-id=122 op=LOAD Dec 16 12:19:02.075000 audit: BPF prog-id=123 op=LOAD Dec 16 12:19:02.075000 audit: BPF prog-id=72 op=UNLOAD Dec 16 12:19:02.075000 audit: BPF prog-id=73 op=UNLOAD Dec 16 12:19:02.077000 audit: BPF prog-id=124 op=LOAD Dec 16 12:19:02.077000 audit: BPF prog-id=69 op=UNLOAD Dec 16 12:19:02.078000 audit: BPF prog-id=125 op=LOAD Dec 16 12:19:02.078000 audit: BPF prog-id=78 op=UNLOAD Dec 16 12:19:02.078000 audit: BPF prog-id=126 op=LOAD Dec 16 12:19:02.078000 audit: BPF prog-id=127 op=LOAD Dec 16 12:19:02.078000 audit: BPF prog-id=79 op=UNLOAD Dec 16 12:19:02.078000 audit: BPF prog-id=80 op=UNLOAD Dec 16 12:19:02.080000 audit: BPF prog-id=128 op=LOAD Dec 16 12:19:02.080000 audit: BPF prog-id=74 op=UNLOAD Dec 16 12:19:02.080000 audit: BPF prog-id=129 op=LOAD Dec 16 12:19:02.080000 audit: BPF prog-id=130 op=LOAD Dec 16 12:19:02.080000 audit: BPF prog-id=75 op=UNLOAD Dec 16 12:19:02.080000 audit: BPF prog-id=76 op=UNLOAD Dec 16 12:19:02.246211 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:19:02.244000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:19:02.259001 (kubelet)[2821]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:19:02.331808 kubelet[2821]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 12:19:02.331808 kubelet[2821]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:19:02.332881 kubelet[2821]: I1216 12:19:02.332655 2821 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:19:02.345666 kubelet[2821]: I1216 12:19:02.345503 2821 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 12:19:02.345666 kubelet[2821]: I1216 12:19:02.345554 2821 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:19:02.345666 kubelet[2821]: I1216 12:19:02.345604 2821 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 12:19:02.345666 kubelet[2821]: I1216 12:19:02.345611 2821 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 12:19:02.346843 kubelet[2821]: I1216 12:19:02.346014 2821 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 12:19:02.347574 kubelet[2821]: I1216 12:19:02.347479 2821 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 16 12:19:02.350176 kubelet[2821]: I1216 12:19:02.350124 2821 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:19:02.355845 kubelet[2821]: I1216 12:19:02.355812 2821 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:19:02.360424 kubelet[2821]: I1216 12:19:02.360349 2821 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 16 12:19:02.360637 kubelet[2821]: I1216 12:19:02.360594 2821 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:19:02.360901 kubelet[2821]: I1216 12:19:02.360648 2821 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-5-8fe0b910ae","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:19:02.360901 kubelet[2821]: I1216 12:19:02.360896 2821 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 12:19:02.360901 kubelet[2821]: I1216 12:19:02.360906 2821 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 12:19:02.361189 kubelet[2821]: I1216 12:19:02.360932 2821 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 12:19:02.362532 kubelet[2821]: I1216 12:19:02.362485 2821 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:19:02.362744 kubelet[2821]: I1216 12:19:02.362712 2821 kubelet.go:475] "Attempting to sync node with API server" Dec 16 12:19:02.362744 kubelet[2821]: I1216 12:19:02.362732 2821 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:19:02.362835 kubelet[2821]: I1216 12:19:02.362775 2821 kubelet.go:387] "Adding apiserver pod source" Dec 16 12:19:02.362835 kubelet[2821]: I1216 12:19:02.362792 2821 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:19:02.365659 kubelet[2821]: I1216 12:19:02.364855 2821 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:19:02.365659 kubelet[2821]: I1216 12:19:02.365641 2821 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 12:19:02.365820 kubelet[2821]: I1216 12:19:02.365676 2821 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 
12:19:02.376884 kubelet[2821]: I1216 12:19:02.376847 2821 server.go:1262] "Started kubelet" Dec 16 12:19:02.382680 kubelet[2821]: I1216 12:19:02.377037 2821 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:19:02.382680 kubelet[2821]: I1216 12:19:02.382295 2821 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 12:19:02.382680 kubelet[2821]: I1216 12:19:02.378414 2821 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:19:02.390374 kubelet[2821]: I1216 12:19:02.390336 2821 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:19:02.391647 kubelet[2821]: I1216 12:19:02.390745 2821 server.go:310] "Adding debug handlers to kubelet server" Dec 16 12:19:02.391647 kubelet[2821]: I1216 12:19:02.378384 2821 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:19:02.391975 kubelet[2821]: I1216 12:19:02.378560 2821 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:19:02.392478 kubelet[2821]: I1216 12:19:02.392094 2821 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 12:19:02.392549 kubelet[2821]: E1216 12:19:02.392483 2821 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-5-8fe0b910ae\" not found" Dec 16 12:19:02.393857 kubelet[2821]: I1216 12:19:02.393812 2821 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 12:19:02.393970 kubelet[2821]: I1216 12:19:02.393951 2821 reconciler.go:29] "Reconciler: start to sync state" Dec 16 12:19:02.414801 kubelet[2821]: I1216 12:19:02.414735 2821 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Dec 16 12:19:02.425711 kubelet[2821]: I1216 12:19:02.424356 2821 factory.go:223] Registration of the containerd container factory successfully Dec 16 12:19:02.425711 kubelet[2821]: I1216 12:19:02.424384 2821 factory.go:223] Registration of the systemd container factory successfully Dec 16 12:19:02.427650 kubelet[2821]: I1216 12:19:02.427575 2821 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:19:02.431833 kubelet[2821]: E1216 12:19:02.431653 2821 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:19:02.438578 kubelet[2821]: I1216 12:19:02.438523 2821 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv4" Dec 16 12:19:02.438578 kubelet[2821]: I1216 12:19:02.438559 2821 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 12:19:02.438578 kubelet[2821]: I1216 12:19:02.438581 2821 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 12:19:02.441829 kubelet[2821]: E1216 12:19:02.438699 2821 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:19:02.499866 kubelet[2821]: I1216 12:19:02.499812 2821 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:19:02.499866 kubelet[2821]: I1216 12:19:02.499838 2821 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:19:02.499866 kubelet[2821]: I1216 12:19:02.499875 2821 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:19:02.500171 kubelet[2821]: I1216 12:19:02.500088 2821 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 16 12:19:02.500171 kubelet[2821]: I1216 12:19:02.500105 2821 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 16 12:19:02.500171 kubelet[2821]: I1216 12:19:02.500133 2821 policy_none.go:49] "None policy: Start" Dec 16 12:19:02.500171 kubelet[2821]: I1216 12:19:02.500147 2821 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 12:19:02.500171 kubelet[2821]: I1216 12:19:02.500388 2821 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 12:19:02.500881 kubelet[2821]: I1216 12:19:02.500545 2821 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Dec 16 12:19:02.500881 kubelet[2821]: I1216 12:19:02.500558 2821 policy_none.go:47] "Start" Dec 16 12:19:02.506957 kubelet[2821]: E1216 12:19:02.506926 2821 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 12:19:02.507599 kubelet[2821]: I1216 12:19:02.507575 2821 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:19:02.507772 kubelet[2821]: I1216 12:19:02.507734 2821 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:19:02.508555 kubelet[2821]: I1216 12:19:02.508531 2821 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:19:02.510145 kubelet[2821]: E1216 12:19:02.510118 2821 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 16 12:19:02.540366 kubelet[2821]: I1216 12:19:02.540323 2821 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.540871 kubelet[2821]: I1216 12:19:02.540838 2821 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.542452 kubelet[2821]: I1216 12:19:02.542414 2821 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.596274 kubelet[2821]: I1216 12:19:02.595527 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b96525c743ae3bc6e9d95d0d57c3e53f-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-5-8fe0b910ae\" (UID: \"b96525c743ae3bc6e9d95d0d57c3e53f\") " pod="kube-system/kube-apiserver-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.596274 kubelet[2821]: I1216 12:19:02.595963 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3c72325168c2828afd25cc084a94b656-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-5-8fe0b910ae\" (UID: \"3c72325168c2828afd25cc084a94b656\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.596274 kubelet[2821]: I1216 12:19:02.595990 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b96525c743ae3bc6e9d95d0d57c3e53f-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-5-8fe0b910ae\" (UID: \"b96525c743ae3bc6e9d95d0d57c3e53f\") " pod="kube-system/kube-apiserver-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.596274 kubelet[2821]: I1216 12:19:02.596013 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b96525c743ae3bc6e9d95d0d57c3e53f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-5-8fe0b910ae\" (UID: \"b96525c743ae3bc6e9d95d0d57c3e53f\") " pod="kube-system/kube-apiserver-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.596274 kubelet[2821]: I1216 12:19:02.596036 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3c72325168c2828afd25cc084a94b656-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-5-8fe0b910ae\" (UID: \"3c72325168c2828afd25cc084a94b656\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.596658 kubelet[2821]: I1216 12:19:02.596056 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3c72325168c2828afd25cc084a94b656-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-5-8fe0b910ae\" (UID: \"3c72325168c2828afd25cc084a94b656\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.596658 kubelet[2821]: I1216 12:19:02.596120 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3c72325168c2828afd25cc084a94b656-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-5-8fe0b910ae\" (UID: \"3c72325168c2828afd25cc084a94b656\") " 
pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.596658 kubelet[2821]: I1216 12:19:02.596185 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3c72325168c2828afd25cc084a94b656-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-5-8fe0b910ae\" (UID: \"3c72325168c2828afd25cc084a94b656\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.596658 kubelet[2821]: I1216 12:19:02.596217 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b496148d32baf531e2e18c8434d9fcaf-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-5-8fe0b910ae\" (UID: \"b496148d32baf531e2e18c8434d9fcaf\") " pod="kube-system/kube-scheduler-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.623834 kubelet[2821]: I1216 12:19:02.623788 2821 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.635017 kubelet[2821]: I1216 12:19:02.634962 2821 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:02.635687 kubelet[2821]: I1216 12:19:02.635300 2821 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:03.370673 kubelet[2821]: I1216 12:19:03.370015 2821 apiserver.go:52] "Watching apiserver" Dec 16 12:19:03.394965 kubelet[2821]: I1216 12:19:03.394915 2821 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 12:19:03.493099 kubelet[2821]: I1216 12:19:03.492194 2821 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:03.508829 kubelet[2821]: E1216 12:19:03.508693 2821 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-5-8fe0b910ae\" already exists" pod="kube-system/kube-scheduler-ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:03.548642 kubelet[2821]: I1216 12:19:03.548551 2821 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-0-0-5-8fe0b910ae" podStartSLOduration=1.54853341 podStartE2EDuration="1.54853341s" podCreationTimestamp="2025-12-16 12:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:19:03.54852853 +0000 UTC m=+1.284098412" watchObservedRunningTime="2025-12-16 12:19:03.54853341 +0000 UTC m=+1.284103252" Dec 16 12:19:03.548994 kubelet[2821]: I1216 12:19:03.548921 2821 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-0-0-5-8fe0b910ae" podStartSLOduration=1.5488941330000001 podStartE2EDuration="1.548894133s" podCreationTimestamp="2025-12-16 12:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:19:03.530554377 +0000 UTC m=+1.266124259" watchObservedRunningTime="2025-12-16 12:19:03.548894133 +0000 UTC m=+1.284464015" Dec 16 12:19:06.410346 kubelet[2821]: I1216 12:19:06.409982 2821 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-0-0-5-8fe0b910ae" podStartSLOduration=4.40995704 podStartE2EDuration="4.40995704s" podCreationTimestamp="2025-12-16 12:19:02 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:19:03.559640341 +0000 UTC m=+1.295210223" watchObservedRunningTime="2025-12-16 12:19:06.40995704 +0000 UTC m=+4.145526922" Dec 16 12:19:07.365269 kubelet[2821]: I1216 12:19:07.365216 2821 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 12:19:07.365755 containerd[1576]: time="2025-12-16T12:19:07.365714198Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 16 12:19:07.366560 kubelet[2821]: I1216 12:19:07.366502 2821 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 12:19:08.347853 systemd[1]: Created slice kubepods-besteffort-pod38dbc406_75e4_46ee_bcb4_aafa6c59bf4e.slice - libcontainer container kubepods-besteffort-pod38dbc406_75e4_46ee_bcb4_aafa6c59bf4e.slice. Dec 16 12:19:08.438266 kubelet[2821]: I1216 12:19:08.438121 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/38dbc406-75e4-46ee-bcb4-aafa6c59bf4e-kube-proxy\") pod \"kube-proxy-8tczh\" (UID: \"38dbc406-75e4-46ee-bcb4-aafa6c59bf4e\") " pod="kube-system/kube-proxy-8tczh" Dec 16 12:19:08.438266 kubelet[2821]: I1216 12:19:08.438166 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/38dbc406-75e4-46ee-bcb4-aafa6c59bf4e-xtables-lock\") pod \"kube-proxy-8tczh\" (UID: \"38dbc406-75e4-46ee-bcb4-aafa6c59bf4e\") " pod="kube-system/kube-proxy-8tczh" Dec 16 12:19:08.438266 kubelet[2821]: I1216 12:19:08.438196 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxqdv\" (UniqueName: \"kubernetes.io/projected/38dbc406-75e4-46ee-bcb4-aafa6c59bf4e-kube-api-access-zxqdv\") pod \"kube-proxy-8tczh\" (UID: \"38dbc406-75e4-46ee-bcb4-aafa6c59bf4e\") " pod="kube-system/kube-proxy-8tczh" Dec 16 12:19:08.438266 kubelet[2821]: I1216 12:19:08.438216 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/38dbc406-75e4-46ee-bcb4-aafa6c59bf4e-lib-modules\") pod \"kube-proxy-8tczh\" (UID: \"38dbc406-75e4-46ee-bcb4-aafa6c59bf4e\") " pod="kube-system/kube-proxy-8tczh" Dec 16 12:19:08.471416 systemd[1]: Created slice kubepods-besteffort-pod0c01355f_9665_410a_9a50_e4885ddd296a.slice - libcontainer container kubepods-besteffort-pod0c01355f_9665_410a_9a50_e4885ddd296a.slice. 
Dec 16 12:19:08.538730 kubelet[2821]: I1216 12:19:08.538679 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0c01355f-9665-410a-9a50-e4885ddd296a-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-n64mv\" (UID: \"0c01355f-9665-410a-9a50-e4885ddd296a\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-n64mv" Dec 16 12:19:08.539658 kubelet[2821]: I1216 12:19:08.539433 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjldt\" (UniqueName: \"kubernetes.io/projected/0c01355f-9665-410a-9a50-e4885ddd296a-kube-api-access-wjldt\") pod \"tigera-operator-65cdcdfd6d-n64mv\" (UID: \"0c01355f-9665-410a-9a50-e4885ddd296a\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-n64mv" Dec 16 12:19:08.665529 containerd[1576]: time="2025-12-16T12:19:08.665447221Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8tczh,Uid:38dbc406-75e4-46ee-bcb4-aafa6c59bf4e,Namespace:kube-system,Attempt:0,}" Dec 16 12:19:08.694170 containerd[1576]: time="2025-12-16T12:19:08.693882628Z" level=info msg="connecting to shim 992ac8291aec22e8011b6ebff93693a14ffe636b63c44392b7e6a1c63c399806" address="unix:///run/containerd/s/b989dd968c23b0fec91c12bdec52b6eff0930df0fce677c3192447324aef0063" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:08.722925 systemd[1]: Started cri-containerd-992ac8291aec22e8011b6ebff93693a14ffe636b63c44392b7e6a1c63c399806.scope - libcontainer container 992ac8291aec22e8011b6ebff93693a14ffe636b63c44392b7e6a1c63c399806. Dec 16 12:19:08.737401 kernel: kauditd_printk_skb: 32 callbacks suppressed Dec 16 12:19:08.737531 kernel: audit: type=1334 audit(1765887548.735:429): prog-id=131 op=LOAD Dec 16 12:19:08.735000 audit: BPF prog-id=131 op=LOAD Dec 16 12:19:08.737000 audit: BPF prog-id=132 op=LOAD Dec 16 12:19:08.738892 kernel: audit: type=1334 audit(1765887548.737:430): prog-id=132 op=LOAD Dec 16 12:19:08.737000 audit[2893]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2881 pid=2893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.741195 kernel: audit: type=1300 audit(1765887548.737:430): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2881 pid=2893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.737000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939326163383239316165633232653830313162366562666639333639 Dec 16 12:19:08.743451 kernel: audit: type=1327 audit(1765887548.737:430): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939326163383239316165633232653830313162366562666639333639 Dec 16 12:19:08.737000 audit: BPF prog-id=132 op=UNLOAD Dec 16 12:19:08.737000 audit[2893]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2881 pid=2893 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.746440 kernel: audit: type=1334 audit(1765887548.737:431): prog-id=132 op=UNLOAD Dec 16 12:19:08.746569 kernel: audit: type=1300 audit(1765887548.737:431): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2881 pid=2893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.737000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939326163383239316165633232653830313162366562666639333639 Dec 16 12:19:08.748869 kernel: audit: type=1327 audit(1765887548.737:431): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939326163383239316165633232653830313162366562666639333639 Dec 16 12:19:08.737000 audit: BPF prog-id=133 op=LOAD Dec 16 12:19:08.749769 kernel: audit: type=1334 audit(1765887548.737:432): prog-id=133 op=LOAD Dec 16 12:19:08.737000 audit[2893]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2881 pid=2893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.752222 kernel: audit: type=1300 audit(1765887548.737:432): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2881 pid=2893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.754449 kernel: audit: type=1327 audit(1765887548.737:432): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939326163383239316165633232653830313162366562666639333639 Dec 16 12:19:08.737000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939326163383239316165633232653830313162366562666639333639 Dec 16 12:19:08.738000 audit: BPF prog-id=134 op=LOAD Dec 16 12:19:08.738000 audit[2893]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2881 pid=2893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.738000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939326163383239316165633232653830313162366562666639333639 Dec 16 12:19:08.738000 audit: BPF prog-id=134 op=UNLOAD Dec 16 12:19:08.738000 audit[2893]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 
items=0 ppid=2881 pid=2893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.738000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939326163383239316165633232653830313162366562666639333639 Dec 16 12:19:08.738000 audit: BPF prog-id=133 op=UNLOAD Dec 16 12:19:08.738000 audit[2893]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2881 pid=2893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.738000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939326163383239316165633232653830313162366562666639333639 Dec 16 12:19:08.738000 audit: BPF prog-id=135 op=LOAD Dec 16 12:19:08.738000 audit[2893]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2881 pid=2893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.738000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939326163383239316165633232653830313162366562666639333639 Dec 16 12:19:08.769972 containerd[1576]: time="2025-12-16T12:19:08.769891145Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8tczh,Uid:38dbc406-75e4-46ee-bcb4-aafa6c59bf4e,Namespace:kube-system,Attempt:0,} returns sandbox id \"992ac8291aec22e8011b6ebff93693a14ffe636b63c44392b7e6a1c63c399806\"" Dec 16 12:19:08.777517 containerd[1576]: time="2025-12-16T12:19:08.777439888Z" level=info msg="CreateContainer within sandbox \"992ac8291aec22e8011b6ebff93693a14ffe636b63c44392b7e6a1c63c399806\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 12:19:08.778491 containerd[1576]: time="2025-12-16T12:19:08.778460063Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-n64mv,Uid:0c01355f-9665-410a-9a50-e4885ddd296a,Namespace:tigera-operator,Attempt:0,}" Dec 16 12:19:08.800194 containerd[1576]: time="2025-12-16T12:19:08.797552281Z" level=info msg="Container dac832476b2bd48cb1a14f81f6c7c2114f1c91284648c00bea9f2f9525961446: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:19:08.805544 containerd[1576]: time="2025-12-16T12:19:08.805493260Z" level=info msg="connecting to shim 3d98d97208929ab448becaafe6402441f3e2835fda6df524c238744b2228d46c" address="unix:///run/containerd/s/11d457f455a69769cb8e11ebb17de6b024e43111f23e59a6f3e7ffe391a2f391" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:08.810102 containerd[1576]: time="2025-12-16T12:19:08.810001800Z" level=info msg="CreateContainer within sandbox \"992ac8291aec22e8011b6ebff93693a14ffe636b63c44392b7e6a1c63c399806\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id 
\"dac832476b2bd48cb1a14f81f6c7c2114f1c91284648c00bea9f2f9525961446\"" Dec 16 12:19:08.811493 containerd[1576]: time="2025-12-16T12:19:08.811455375Z" level=info msg="StartContainer for \"dac832476b2bd48cb1a14f81f6c7c2114f1c91284648c00bea9f2f9525961446\"" Dec 16 12:19:08.815069 containerd[1576]: time="2025-12-16T12:19:08.815029988Z" level=info msg="connecting to shim dac832476b2bd48cb1a14f81f6c7c2114f1c91284648c00bea9f2f9525961446" address="unix:///run/containerd/s/b989dd968c23b0fec91c12bdec52b6eff0930df0fce677c3192447324aef0063" protocol=ttrpc version=3 Dec 16 12:19:08.850939 systemd[1]: Started cri-containerd-3d98d97208929ab448becaafe6402441f3e2835fda6df524c238744b2228d46c.scope - libcontainer container 3d98d97208929ab448becaafe6402441f3e2835fda6df524c238744b2228d46c. Dec 16 12:19:08.854830 systemd[1]: Started cri-containerd-dac832476b2bd48cb1a14f81f6c7c2114f1c91284648c00bea9f2f9525961446.scope - libcontainer container dac832476b2bd48cb1a14f81f6c7c2114f1c91284648c00bea9f2f9525961446. Dec 16 12:19:08.875000 audit: BPF prog-id=136 op=LOAD Dec 16 12:19:08.876000 audit: BPF prog-id=137 op=LOAD Dec 16 12:19:08.876000 audit[2939]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2927 pid=2939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.876000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364393864393732303839323961623434386265636161666536343032 Dec 16 12:19:08.876000 audit: BPF prog-id=137 op=UNLOAD Dec 16 12:19:08.876000 audit[2939]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2927 pid=2939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.876000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364393864393732303839323961623434386265636161666536343032 Dec 16 12:19:08.876000 audit: BPF prog-id=138 op=LOAD Dec 16 12:19:08.876000 audit[2939]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2927 pid=2939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.876000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364393864393732303839323961623434386265636161666536343032 Dec 16 12:19:08.876000 audit: BPF prog-id=139 op=LOAD Dec 16 12:19:08.876000 audit[2939]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2927 pid=2939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.876000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364393864393732303839323961623434386265636161666536343032 Dec 16 12:19:08.877000 audit: BPF prog-id=139 op=UNLOAD Dec 16 12:19:08.877000 audit[2939]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2927 pid=2939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.877000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364393864393732303839323961623434386265636161666536343032 Dec 16 12:19:08.877000 audit: BPF prog-id=138 op=UNLOAD Dec 16 12:19:08.877000 audit[2939]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2927 pid=2939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.877000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364393864393732303839323961623434386265636161666536343032 Dec 16 12:19:08.877000 audit: BPF prog-id=140 op=LOAD Dec 16 12:19:08.877000 audit[2939]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2927 pid=2939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.877000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364393864393732303839323961623434386265636161666536343032 Dec 16 12:19:08.912218 containerd[1576]: time="2025-12-16T12:19:08.912109627Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-n64mv,Uid:0c01355f-9665-410a-9a50-e4885ddd296a,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3d98d97208929ab448becaafe6402441f3e2835fda6df524c238744b2228d46c\"" Dec 16 12:19:08.916000 audit: BPF prog-id=141 op=LOAD Dec 16 12:19:08.916000 audit[2940]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=2881 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.916000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461633833323437366232626434386362316131346638316636633763 Dec 16 12:19:08.916000 audit: BPF prog-id=142 op=LOAD Dec 16 12:19:08.916000 audit[2940]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=2881 pid=2940 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.916000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461633833323437366232626434386362316131346638316636633763 Dec 16 12:19:08.917000 audit: BPF prog-id=142 op=UNLOAD Dec 16 12:19:08.917000 audit[2940]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2881 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.917000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461633833323437366232626434386362316131346638316636633763 Dec 16 12:19:08.917000 audit: BPF prog-id=141 op=UNLOAD Dec 16 12:19:08.917000 audit[2940]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2881 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.917000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461633833323437366232626434386362316131346638316636633763 Dec 16 12:19:08.917000 audit: BPF prog-id=143 op=LOAD Dec 16 12:19:08.917000 audit[2940]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=2881 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:08.917000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461633833323437366232626434386362316131346638316636633763 Dec 16 12:19:08.922046 containerd[1576]: time="2025-12-16T12:19:08.916881032Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 12:19:08.953112 containerd[1576]: time="2025-12-16T12:19:08.952959631Z" level=info msg="StartContainer for \"dac832476b2bd48cb1a14f81f6c7c2114f1c91284648c00bea9f2f9525961446\" returns successfully" Dec 16 12:19:09.204000 audit[3028]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3028 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.204000 audit[3028]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe2f24990 a2=0 a3=1 items=0 ppid=2972 pid=3028 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.204000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:19:09.205000 audit[3029]: NETFILTER_CFG table=nat:55 
family=2 entries=1 op=nft_register_chain pid=3029 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.205000 audit[3029]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffce5f0ae0 a2=0 a3=1 items=0 ppid=2972 pid=3029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.205000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:19:09.207000 audit[3030]: NETFILTER_CFG table=filter:56 family=2 entries=1 op=nft_register_chain pid=3030 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.207000 audit[3030]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffec26bc50 a2=0 a3=1 items=0 ppid=2972 pid=3030 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.207000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:19:09.209000 audit[3031]: NETFILTER_CFG table=mangle:57 family=10 entries=1 op=nft_register_chain pid=3031 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.209000 audit[3031]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffdefad920 a2=0 a3=1 items=0 ppid=2972 pid=3031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.209000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:19:09.211000 audit[3035]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3035 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.211000 audit[3035]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd65deb80 a2=0 a3=1 items=0 ppid=2972 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.211000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:19:09.218000 audit[3036]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3036 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.218000 audit[3036]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd0c33a90 a2=0 a3=1 items=0 ppid=2972 pid=3036 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.218000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:19:09.313000 audit[3037]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3037 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.313000 audit[3037]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffc087c700 a2=0 a3=1 items=0 ppid=2972 pid=3037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.313000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:19:09.318000 audit[3039]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3039 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.318000 audit[3039]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffceb02870 a2=0 a3=1 items=0 ppid=2972 pid=3039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.318000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73002D Dec 16 12:19:09.322000 audit[3042]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3042 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.322000 audit[3042]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd25f6290 a2=0 a3=1 items=0 ppid=2972 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.322000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Dec 16 12:19:09.323000 audit[3043]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3043 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.323000 audit[3043]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffea64b20 a2=0 a3=1 items=0 ppid=2972 pid=3043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.323000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:19:09.326000 audit[3045]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3045 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.326000 audit[3045]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff6a244b0 a2=0 a3=1 items=0 ppid=2972 pid=3045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.326000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:19:09.328000 audit[3046]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3046 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.328000 audit[3046]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcfde8400 a2=0 a3=1 items=0 ppid=2972 pid=3046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.328000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:19:09.330000 audit[3048]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3048 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.330000 audit[3048]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffd15c69d0 a2=0 a3=1 items=0 ppid=2972 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.330000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:19:09.335000 audit[3051]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3051 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.335000 audit[3051]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffff524f530 a2=0 a3=1 items=0 ppid=2972 pid=3051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.335000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:19:09.337000 audit[3052]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3052 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.337000 audit[3052]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc8daae20 a2=0 a3=1 items=0 ppid=2972 pid=3052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.337000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:19:09.340000 audit[3054]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3054 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.340000 audit[3054]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc100ce70 a2=0 a3=1 items=0 ppid=2972 pid=3054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.340000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:19:09.342000 audit[3055]: NETFILTER_CFG table=filter:70 family=2 entries=1 
op=nft_register_chain pid=3055 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.342000 audit[3055]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe0232600 a2=0 a3=1 items=0 ppid=2972 pid=3055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.342000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:19:09.345000 audit[3057]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3057 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.345000 audit[3057]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffecb12b20 a2=0 a3=1 items=0 ppid=2972 pid=3057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.345000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F5859 Dec 16 12:19:09.353000 audit[3060]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3060 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.353000 audit[3060]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff0b5c430 a2=0 a3=1 items=0 ppid=2972 pid=3060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.353000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Dec 16 12:19:09.358000 audit[3063]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3063 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.358000 audit[3063]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffca3c8170 a2=0 a3=1 items=0 ppid=2972 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.358000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Dec 16 12:19:09.360000 audit[3064]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3064 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.360000 audit[3064]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff3e83ef0 a2=0 a3=1 items=0 ppid=2972 pid=3064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.360000 audit: 
PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:19:09.362000 audit[3066]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3066 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.362000 audit[3066]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=fffff4165a80 a2=0 a3=1 items=0 ppid=2972 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.362000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:19:09.368000 audit[3069]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3069 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.368000 audit[3069]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc36d1fa0 a2=0 a3=1 items=0 ppid=2972 pid=3069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.368000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:19:09.370000 audit[3070]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3070 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.370000 audit[3070]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc8e57390 a2=0 a3=1 items=0 ppid=2972 pid=3070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.370000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:19:09.373000 audit[3072]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3072 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:19:09.373000 audit[3072]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffe35aca40 a2=0 a3=1 items=0 ppid=2972 pid=3072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.373000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:19:09.400000 audit[3078]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3078 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:09.400000 audit[3078]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffcd3f45a0 a2=0 a3=1 items=0 ppid=2972 pid=3078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.400000 
audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:09.412000 audit[3078]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3078 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:09.412000 audit[3078]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffcd3f45a0 a2=0 a3=1 items=0 ppid=2972 pid=3078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.412000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:09.415000 audit[3083]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3083 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.415000 audit[3083]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffebe636c0 a2=0 a3=1 items=0 ppid=2972 pid=3083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.415000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:19:09.419000 audit[3085]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3085 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.419000 audit[3085]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffe943a510 a2=0 a3=1 items=0 ppid=2972 pid=3085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.419000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Dec 16 12:19:09.424000 audit[3088]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3088 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.424000 audit[3088]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffedd5b230 a2=0 a3=1 items=0 ppid=2972 pid=3088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.424000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C Dec 16 12:19:09.425000 audit[3089]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3089 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.425000 audit[3089]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc82a5dc0 a2=0 a3=1 items=0 ppid=2972 pid=3089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.425000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:19:09.429000 audit[3091]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3091 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.429000 audit[3091]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff895ef00 a2=0 a3=1 items=0 ppid=2972 pid=3091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.429000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:19:09.430000 audit[3092]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3092 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.430000 audit[3092]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd9c74fc0 a2=0 a3=1 items=0 ppid=2972 pid=3092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.430000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:19:09.433000 audit[3094]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.433000 audit[3094]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffff28717f0 a2=0 a3=1 items=0 ppid=2972 pid=3094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.433000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:19:09.438000 audit[3097]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.438000 audit[3097]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffe4243f00 a2=0 a3=1 items=0 ppid=2972 pid=3097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.438000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:19:09.439000 audit[3098]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3098 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.439000 audit[3098]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffcd259f0 a2=0 a3=1 items=0 ppid=2972 
pid=3098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.439000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:19:09.443000 audit[3100]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.443000 audit[3100]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe0f343b0 a2=0 a3=1 items=0 ppid=2972 pid=3100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.443000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:19:09.445000 audit[3101]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3101 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.445000 audit[3101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff6923b00 a2=0 a3=1 items=0 ppid=2972 pid=3101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.445000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:19:09.449000 audit[3103]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3103 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.449000 audit[3103]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd521c860 a2=0 a3=1 items=0 ppid=2972 pid=3103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.449000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Dec 16 12:19:09.453000 audit[3106]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3106 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.453000 audit[3106]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffe1542a80 a2=0 a3=1 items=0 ppid=2972 pid=3106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.453000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Dec 16 12:19:09.459000 audit[3109]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3109 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Dec 16 12:19:09.459000 audit[3109]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffc3a50020 a2=0 a3=1 items=0 ppid=2972 pid=3109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.459000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D5052 Dec 16 12:19:09.462000 audit[3110]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3110 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.462000 audit[3110]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffdf95b400 a2=0 a3=1 items=0 ppid=2972 pid=3110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.462000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:19:09.465000 audit[3112]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3112 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.465000 audit[3112]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffd434ddc0 a2=0 a3=1 items=0 ppid=2972 pid=3112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.465000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:19:09.470000 audit[3115]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3115 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.470000 audit[3115]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe5c116c0 a2=0 a3=1 items=0 ppid=2972 pid=3115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.470000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:19:09.471000 audit[3116]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3116 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.471000 audit[3116]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff6676500 a2=0 a3=1 items=0 ppid=2972 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.471000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:19:09.475000 audit[3118]: NETFILTER_CFG table=nat:99 family=10 entries=2 
op=nft_register_chain pid=3118 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.475000 audit[3118]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffd3f18720 a2=0 a3=1 items=0 ppid=2972 pid=3118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.475000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:19:09.476000 audit[3119]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3119 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.476000 audit[3119]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdac49540 a2=0 a3=1 items=0 ppid=2972 pid=3119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.476000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:19:09.479000 audit[3121]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.479000 audit[3121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffca47b320 a2=0 a3=1 items=0 ppid=2972 pid=3121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.479000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:19:09.483000 audit[3124]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3124 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:19:09.483000 audit[3124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffc2805b10 a2=0 a3=1 items=0 ppid=2972 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.483000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:19:09.487000 audit[3126]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3126 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:19:09.487000 audit[3126]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=ffffd35303f0 a2=0 a3=1 items=0 ppid=2972 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.487000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:09.487000 audit[3126]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3126 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:19:09.487000 audit[3126]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffd35303f0 a2=0 a3=1 items=0 ppid=2972 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:09.487000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:09.528486 kubelet[2821]: I1216 12:19:09.527998 2821 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-8tczh" podStartSLOduration=1.5279789419999998 podStartE2EDuration="1.527978942s" podCreationTimestamp="2025-12-16 12:19:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:19:09.527664714 +0000 UTC m=+7.263234676" watchObservedRunningTime="2025-12-16 12:19:09.527978942 +0000 UTC m=+7.263548824" Dec 16 12:19:09.559025 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2280965028.mount: Deactivated successfully. Dec 16 12:19:11.463373 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount290407922.mount: Deactivated successfully. Dec 16 12:19:11.942863 containerd[1576]: time="2025-12-16T12:19:11.942795176Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:11.944566 containerd[1576]: time="2025-12-16T12:19:11.944475632Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Dec 16 12:19:11.945440 containerd[1576]: time="2025-12-16T12:19:11.945385866Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:11.948372 containerd[1576]: time="2025-12-16T12:19:11.948295661Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:11.949664 containerd[1576]: time="2025-12-16T12:19:11.949593525Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 3.031776687s" Dec 16 12:19:11.949830 containerd[1576]: time="2025-12-16T12:19:11.949709655Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 16 12:19:11.955752 containerd[1576]: time="2025-12-16T12:19:11.955612532Z" level=info msg="CreateContainer within sandbox \"3d98d97208929ab448becaafe6402441f3e2835fda6df524c238744b2228d46c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 12:19:11.965082 containerd[1576]: time="2025-12-16T12:19:11.961555452Z" level=info msg="Container 9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:19:11.970470 containerd[1576]: time="2025-12-16T12:19:11.970422088Z" level=info msg="CreateContainer within sandbox \"3d98d97208929ab448becaafe6402441f3e2835fda6df524c238744b2228d46c\" for 
&ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e\"" Dec 16 12:19:11.971760 containerd[1576]: time="2025-12-16T12:19:11.971312880Z" level=info msg="StartContainer for \"9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e\"" Dec 16 12:19:11.973746 containerd[1576]: time="2025-12-16T12:19:11.973678591Z" level=info msg="connecting to shim 9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e" address="unix:///run/containerd/s/11d457f455a69769cb8e11ebb17de6b024e43111f23e59a6f3e7ffe391a2f391" protocol=ttrpc version=3 Dec 16 12:19:12.003092 systemd[1]: Started cri-containerd-9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e.scope - libcontainer container 9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e. Dec 16 12:19:12.017000 audit: BPF prog-id=144 op=LOAD Dec 16 12:19:12.018000 audit: BPF prog-id=145 op=LOAD Dec 16 12:19:12.018000 audit[3135]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000186180 a2=98 a3=0 items=0 ppid=2927 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:12.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961383137646231616139366433353237303162373831333635666666 Dec 16 12:19:12.018000 audit: BPF prog-id=145 op=UNLOAD Dec 16 12:19:12.018000 audit[3135]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2927 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:12.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961383137646231616139366433353237303162373831333635666666 Dec 16 12:19:12.018000 audit: BPF prog-id=146 op=LOAD Dec 16 12:19:12.018000 audit[3135]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001863e8 a2=98 a3=0 items=0 ppid=2927 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:12.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961383137646231616139366433353237303162373831333635666666 Dec 16 12:19:12.018000 audit: BPF prog-id=147 op=LOAD Dec 16 12:19:12.018000 audit[3135]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000186168 a2=98 a3=0 items=0 ppid=2927 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:12.018000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961383137646231616139366433353237303162373831333635666666 Dec 16 12:19:12.018000 audit: BPF prog-id=147 op=UNLOAD Dec 16 12:19:12.018000 audit[3135]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2927 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:12.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961383137646231616139366433353237303162373831333635666666 Dec 16 12:19:12.018000 audit: BPF prog-id=146 op=UNLOAD Dec 16 12:19:12.018000 audit[3135]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2927 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:12.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961383137646231616139366433353237303162373831333635666666 Dec 16 12:19:12.018000 audit: BPF prog-id=148 op=LOAD Dec 16 12:19:12.018000 audit[3135]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000186648 a2=98 a3=0 items=0 ppid=2927 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:12.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961383137646231616139366433353237303162373831333635666666 Dec 16 12:19:12.040739 containerd[1576]: time="2025-12-16T12:19:12.040694744Z" level=info msg="StartContainer for \"9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e\" returns successfully" Dec 16 12:19:15.487669 kubelet[2821]: I1216 12:19:15.487185 2821 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-n64mv" podStartSLOduration=4.45200894 podStartE2EDuration="7.486604197s" podCreationTimestamp="2025-12-16 12:19:08 +0000 UTC" firstStartedPulling="2025-12-16 12:19:08.916015471 +0000 UTC m=+6.651585313" lastFinishedPulling="2025-12-16 12:19:11.950610728 +0000 UTC m=+9.686180570" observedRunningTime="2025-12-16 12:19:12.544505384 +0000 UTC m=+10.280075226" watchObservedRunningTime="2025-12-16 12:19:15.486604197 +0000 UTC m=+13.222174119" Dec 16 12:19:16.346384 kernel: kauditd_printk_skb: 224 callbacks suppressed Dec 16 12:19:16.346502 kernel: audit: type=1106 audit(1765887556.342:509): pid=1899 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 12:19:16.342000 audit[1899]: USER_END pid=1899 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:19:16.343437 sudo[1899]: pam_unix(sudo:session): session closed for user root Dec 16 12:19:16.345000 audit[1899]: CRED_DISP pid=1899 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:19:16.349404 kernel: audit: type=1104 audit(1765887556.345:510): pid=1899 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:19:16.504438 sshd[1882]: Connection closed by 147.75.109.163 port 39552 Dec 16 12:19:16.508040 sshd-session[1878]: pam_unix(sshd:session): session closed for user core Dec 16 12:19:16.510000 audit[1878]: USER_END pid=1878 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:19:16.510000 audit[1878]: CRED_DISP pid=1878 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:19:16.516862 kernel: audit: type=1106 audit(1765887556.510:511): pid=1878 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:19:16.516948 kernel: audit: type=1104 audit(1765887556.510:512): pid=1878 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:19:16.516435 systemd[1]: sshd@6-46.224.130.63:22-147.75.109.163:39552.service: Deactivated successfully. Dec 16 12:19:16.516000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-46.224.130.63:22-147.75.109.163:39552 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:19:16.519534 kernel: audit: type=1131 audit(1765887556.516:513): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-46.224.130.63:22-147.75.109.163:39552 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:19:16.521412 systemd[1]: session-8.scope: Deactivated successfully. Dec 16 12:19:16.522986 systemd[1]: session-8.scope: Consumed 7.771s CPU time, 221.1M memory peak. Dec 16 12:19:16.526074 systemd-logind[1543]: Session 8 logged out. Waiting for processes to exit. Dec 16 12:19:16.530069 systemd-logind[1543]: Removed session 8. 
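The kubelet pod_startup_latency_tracker entries a few records above (12:19:09 for kube-proxy-8tczh, 12:19:15 for tigera-operator-65cdcdfd6d-n64mv) report two figures: an end-to-end duration measured from podCreationTimestamp to the time the pod was observed running, and an SLO duration which, in the tigera-operator record, works out to the end-to-end time minus the image-pull window. A minimal sketch of that arithmetic, assuming nothing beyond the timestamps printed in that record (truncated here to microsecond precision for strptime; this is a standalone check, not kubelet code):

from datetime import datetime

def ts(s: str) -> datetime:
    # Timestamps copied from the tigera-operator startup-latency record above.
    return datetime.strptime(s, "%Y-%m-%d %H:%M:%S.%f")

e2e  = ts("2025-12-16 12:19:15.486604") - ts("2025-12-16 12:19:08.000000")  # watch time - pod creation
pull = ts("2025-12-16 12:19:11.950610") - ts("2025-12-16 12:19:08.916015")  # lastFinishedPulling - firstStartedPulling
print(e2e)         # 0:00:07.486604  ~ podStartE2EDuration=7.486604197s
print(e2e - pull)  # 0:00:04.452009  ~ podStartSLOduration=4.45200894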
Dec 16 12:19:20.215000 audit[3214]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3214 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:20.215000 audit[3214]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffd177cfe0 a2=0 a3=1 items=0 ppid=2972 pid=3214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:20.220564 kernel: audit: type=1325 audit(1765887560.215:514): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3214 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:20.220718 kernel: audit: type=1300 audit(1765887560.215:514): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffd177cfe0 a2=0 a3=1 items=0 ppid=2972 pid=3214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:20.215000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:20.222647 kernel: audit: type=1327 audit(1765887560.215:514): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:20.223000 audit[3214]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3214 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:20.226650 kernel: audit: type=1325 audit(1765887560.223:515): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3214 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:20.223000 audit[3214]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd177cfe0 a2=0 a3=1 items=0 ppid=2972 pid=3214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:20.223000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:20.234640 kernel: audit: type=1300 audit(1765887560.223:515): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd177cfe0 a2=0 a3=1 items=0 ppid=2972 pid=3214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:21.257000 audit[3216]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3216 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:21.257000 audit[3216]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffd2ebf3d0 a2=0 a3=1 items=0 ppid=2972 pid=3216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:21.257000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:21.262000 audit[3216]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3216 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:21.262000 audit[3216]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=2700 a0=3 a1=ffffd2ebf3d0 a2=0 a3=1 items=0 ppid=2972 pid=3216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:21.262000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:25.609000 audit[3220]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3220 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:25.610984 kernel: kauditd_printk_skb: 7 callbacks suppressed Dec 16 12:19:25.611164 kernel: audit: type=1325 audit(1765887565.609:518): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3220 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:25.609000 audit[3220]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffd6101b60 a2=0 a3=1 items=0 ppid=2972 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:25.615574 kernel: audit: type=1300 audit(1765887565.609:518): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffd6101b60 a2=0 a3=1 items=0 ppid=2972 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:25.609000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:25.616880 kernel: audit: type=1327 audit(1765887565.609:518): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:25.620000 audit[3220]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3220 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:25.620000 audit[3220]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd6101b60 a2=0 a3=1 items=0 ppid=2972 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:25.624723 kernel: audit: type=1325 audit(1765887565.620:519): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3220 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:25.624839 kernel: audit: type=1300 audit(1765887565.620:519): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd6101b60 a2=0 a3=1 items=0 ppid=2972 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:25.620000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:25.627197 kernel: audit: type=1327 audit(1765887565.620:519): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:26.672000 audit[3223]: NETFILTER_CFG table=filter:111 family=2 entries=19 op=nft_register_rule pid=3223 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:26.672000 audit[3223]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=7480 a0=3 a1=ffffda08d5d0 a2=0 a3=1 items=0 ppid=2972 pid=3223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:26.676775 kernel: audit: type=1325 audit(1765887566.672:520): table=filter:111 family=2 entries=19 op=nft_register_rule pid=3223 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:26.676877 kernel: audit: type=1300 audit(1765887566.672:520): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffda08d5d0 a2=0 a3=1 items=0 ppid=2972 pid=3223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:26.678931 kernel: audit: type=1327 audit(1765887566.672:520): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:26.672000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:26.679000 audit[3223]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3223 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:26.679000 audit[3223]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffda08d5d0 a2=0 a3=1 items=0 ppid=2972 pid=3223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:26.679000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:26.683653 kernel: audit: type=1325 audit(1765887566.679:521): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3223 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:28.690000 audit[3225]: NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule pid=3225 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:28.690000 audit[3225]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffdbdc8460 a2=0 a3=1 items=0 ppid=2972 pid=3225 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:28.690000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:28.695000 audit[3225]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3225 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:28.695000 audit[3225]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffdbdc8460 a2=0 a3=1 items=0 ppid=2972 pid=3225 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:28.695000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:28.741399 systemd[1]: Created slice kubepods-besteffort-podcd0c4eb5_974e_4b22_ad44_b8e0ac04125d.slice - libcontainer container kubepods-besteffort-podcd0c4eb5_974e_4b22_ad44_b8e0ac04125d.slice. 
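The NETFILTER_CFG bursts above (12:19:20 through 12:19:28) are kube-proxy re-running iptables-restore as service and endpoint state changes; the accompanying PROCTITLE records hex-encode the command line because its arguments are NUL-separated. A minimal decoding sketch, using only Python's standard library (the helper name is ours, not from any tool appearing in this log):

def decode_proctitle(hex_argv: str) -> str:
    """Turn an audit PROCTITLE hex string (NUL-separated argv) back into a readable command."""
    raw = bytes.fromhex(hex_argv)
    return " ".join(part.decode() for part in raw.split(b"\x00") if part)

# The proctitle repeated in the iptables-restore records above:
print(decode_proctitle(
    "69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273"
))  # -> iptables-restore -w 5 --noflush --counters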
Dec 16 12:19:28.748819 kubelet[2821]: E1216 12:19:28.748752 2821 reflector.go:205] "Failed to watch" err="failed to list *v1.Secret: secrets \"typha-certs\" is forbidden: User \"system:node:ci-4547-0-0-5-8fe0b910ae\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4547-0-0-5-8fe0b910ae' and this object" logger="UnhandledError" reflector="object-\"calico-system\"/\"typha-certs\"" type="*v1.Secret" Dec 16 12:19:28.772821 kubelet[2821]: I1216 12:19:28.772769 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnp82\" (UniqueName: \"kubernetes.io/projected/cd0c4eb5-974e-4b22-ad44-b8e0ac04125d-kube-api-access-fnp82\") pod \"calico-typha-7584978976-9slwf\" (UID: \"cd0c4eb5-974e-4b22-ad44-b8e0ac04125d\") " pod="calico-system/calico-typha-7584978976-9slwf" Dec 16 12:19:28.772821 kubelet[2821]: I1216 12:19:28.772817 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd0c4eb5-974e-4b22-ad44-b8e0ac04125d-tigera-ca-bundle\") pod \"calico-typha-7584978976-9slwf\" (UID: \"cd0c4eb5-974e-4b22-ad44-b8e0ac04125d\") " pod="calico-system/calico-typha-7584978976-9slwf" Dec 16 12:19:28.773009 kubelet[2821]: I1216 12:19:28.772853 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/cd0c4eb5-974e-4b22-ad44-b8e0ac04125d-typha-certs\") pod \"calico-typha-7584978976-9slwf\" (UID: \"cd0c4eb5-974e-4b22-ad44-b8e0ac04125d\") " pod="calico-system/calico-typha-7584978976-9slwf" Dec 16 12:19:28.861010 systemd[1]: Created slice kubepods-besteffort-pod1f516d62_f3d8_4caa_bebd_d5f9a0d03ea7.slice - libcontainer container kubepods-besteffort-pod1f516d62_f3d8_4caa_bebd_d5f9a0d03ea7.slice. 
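The reconciler_common entries above enumerate every volume the kubelet attaches for the calico-typha pod (the entries that follow do the same for calico-node). A throwaway sketch for pulling volume name, plugin, and pod out of those records, assuming the backslash-escaped quoting shown in this journal; the regex and helper are ours, not part of kubelet:

import re

# Matches: volume \"NAME\" (UniqueName: \"PLUGIN/UID-NAME\") pod \"POD\"
VOLUME_RE = re.compile(r'volume \\"([^"]+?)\\" \(UniqueName: \\"([^"]+?)\\"\) pod \\"([^"]+?)\\"')

def list_volumes(log_text: str) -> None:
    for name, unique_name, pod in VOLUME_RE.findall(log_text):
        plugin = "/".join(unique_name.split("/")[:2])   # e.g. kubernetes.io/projected
        print(f"{pod}: {name} ({plugin})")

# Run against the records above, this prints, for example:
#   calico-typha-7584978976-9slwf: kube-api-access-fnp82 (kubernetes.io/projected)
#   calico-typha-7584978976-9slwf: tigera-ca-bundle (kubernetes.io/configmap)
#   calico-typha-7584978976-9slwf: typha-certs (kubernetes.io/secret)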
Dec 16 12:19:28.874469 kubelet[2821]: I1216 12:19:28.873988 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-var-lib-calico\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.874469 kubelet[2821]: I1216 12:19:28.874479 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959qs\" (UniqueName: \"kubernetes.io/projected/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-kube-api-access-959qs\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.874646 kubelet[2821]: I1216 12:19:28.874502 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-lib-modules\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.874646 kubelet[2821]: I1216 12:19:28.874530 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-node-certs\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.874646 kubelet[2821]: I1216 12:19:28.874553 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-xtables-lock\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.874646 kubelet[2821]: I1216 12:19:28.874569 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-var-run-calico\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.874646 kubelet[2821]: I1216 12:19:28.874587 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-cni-bin-dir\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.874769 kubelet[2821]: I1216 12:19:28.874610 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-flexvol-driver-host\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.874769 kubelet[2821]: I1216 12:19:28.874647 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-cni-log-dir\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.874769 kubelet[2821]: I1216 12:19:28.874663 2821 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-policysync\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.874769 kubelet[2821]: I1216 12:19:28.874678 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-cni-net-dir\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.874769 kubelet[2821]: I1216 12:19:28.874699 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7-tigera-ca-bundle\") pod \"calico-node-s82gj\" (UID: \"1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7\") " pod="calico-system/calico-node-s82gj" Dec 16 12:19:28.980724 kubelet[2821]: E1216 12:19:28.979176 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:19:28.988128 kubelet[2821]: E1216 12:19:28.985973 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.988128 kubelet[2821]: W1216 12:19:28.986003 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.988128 kubelet[2821]: E1216 12:19:28.986026 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:28.989887 kubelet[2821]: E1216 12:19:28.989843 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.989887 kubelet[2821]: W1216 12:19:28.989875 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.990050 kubelet[2821]: E1216 12:19:28.989899 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:28.990358 kubelet[2821]: E1216 12:19:28.990324 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.990358 kubelet[2821]: W1216 12:19:28.990345 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.990358 kubelet[2821]: E1216 12:19:28.990360 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:28.992702 kubelet[2821]: E1216 12:19:28.992668 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.992702 kubelet[2821]: W1216 12:19:28.992693 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.992855 kubelet[2821]: E1216 12:19:28.992714 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:28.993453 kubelet[2821]: E1216 12:19:28.993428 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.993453 kubelet[2821]: W1216 12:19:28.993451 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.994163 kubelet[2821]: E1216 12:19:28.994129 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:28.994720 kubelet[2821]: E1216 12:19:28.994697 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.994825 kubelet[2821]: W1216 12:19:28.994717 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.994870 kubelet[2821]: E1216 12:19:28.994828 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:28.995313 kubelet[2821]: E1216 12:19:28.995293 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.995313 kubelet[2821]: W1216 12:19:28.995310 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.995387 kubelet[2821]: E1216 12:19:28.995322 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:28.996132 kubelet[2821]: E1216 12:19:28.996110 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.996132 kubelet[2821]: W1216 12:19:28.996130 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.996198 kubelet[2821]: E1216 12:19:28.996142 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:28.997762 kubelet[2821]: E1216 12:19:28.996673 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.997762 kubelet[2821]: W1216 12:19:28.997758 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.997881 kubelet[2821]: E1216 12:19:28.997773 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:28.998046 kubelet[2821]: E1216 12:19:28.998024 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.998075 kubelet[2821]: W1216 12:19:28.998055 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.998075 kubelet[2821]: E1216 12:19:28.998067 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:28.998731 kubelet[2821]: E1216 12:19:28.998707 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.998731 kubelet[2821]: W1216 12:19:28.998723 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.998731 kubelet[2821]: E1216 12:19:28.998734 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:28.999088 kubelet[2821]: E1216 12:19:28.999059 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.999088 kubelet[2821]: W1216 12:19:28.999076 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.999088 kubelet[2821]: E1216 12:19:28.999088 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:28.999297 kubelet[2821]: E1216 12:19:28.999272 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.999297 kubelet[2821]: W1216 12:19:28.999298 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.999364 kubelet[2821]: E1216 12:19:28.999308 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:28.999526 kubelet[2821]: E1216 12:19:28.999501 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.999526 kubelet[2821]: W1216 12:19:28.999514 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.999590 kubelet[2821]: E1216 12:19:28.999523 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:28.999899 kubelet[2821]: E1216 12:19:28.999868 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:28.999899 kubelet[2821]: W1216 12:19:28.999886 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:28.999899 kubelet[2821]: E1216 12:19:28.999897 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.000277 kubelet[2821]: E1216 12:19:29.000157 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.000277 kubelet[2821]: W1216 12:19:29.000271 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.000339 kubelet[2821]: E1216 12:19:29.000283 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.000699 kubelet[2821]: E1216 12:19:29.000678 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.000699 kubelet[2821]: W1216 12:19:29.000693 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.000816 kubelet[2821]: E1216 12:19:29.000703 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.001649 kubelet[2821]: E1216 12:19:29.001332 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.001649 kubelet[2821]: W1216 12:19:29.001351 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.001649 kubelet[2821]: E1216 12:19:29.001361 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.020003 kubelet[2821]: E1216 12:19:29.019906 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.020134 kubelet[2821]: W1216 12:19:29.019973 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.020134 kubelet[2821]: E1216 12:19:29.020127 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.053504 kubelet[2821]: E1216 12:19:29.053462 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.053504 kubelet[2821]: W1216 12:19:29.053496 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.053682 kubelet[2821]: E1216 12:19:29.053521 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.054068 kubelet[2821]: E1216 12:19:29.054021 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.054126 kubelet[2821]: W1216 12:19:29.054048 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.054126 kubelet[2821]: E1216 12:19:29.054095 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.054363 kubelet[2821]: E1216 12:19:29.054337 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.054363 kubelet[2821]: W1216 12:19:29.054354 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.054363 kubelet[2821]: E1216 12:19:29.054366 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.054844 kubelet[2821]: E1216 12:19:29.054802 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.054844 kubelet[2821]: W1216 12:19:29.054824 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.054933 kubelet[2821]: E1216 12:19:29.054855 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.055665 kubelet[2821]: E1216 12:19:29.055586 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.055665 kubelet[2821]: W1216 12:19:29.055607 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.055665 kubelet[2821]: E1216 12:19:29.055650 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.055876 kubelet[2821]: E1216 12:19:29.055789 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.055876 kubelet[2821]: W1216 12:19:29.055801 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.055876 kubelet[2821]: E1216 12:19:29.055809 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.056401 kubelet[2821]: E1216 12:19:29.056363 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.056401 kubelet[2821]: W1216 12:19:29.056384 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.056401 kubelet[2821]: E1216 12:19:29.056396 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.056582 kubelet[2821]: E1216 12:19:29.056560 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.056582 kubelet[2821]: W1216 12:19:29.056573 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.056582 kubelet[2821]: E1216 12:19:29.056582 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.056759 kubelet[2821]: E1216 12:19:29.056740 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.056759 kubelet[2821]: W1216 12:19:29.056753 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.056818 kubelet[2821]: E1216 12:19:29.056763 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.058886 kubelet[2821]: E1216 12:19:29.058853 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.058886 kubelet[2821]: W1216 12:19:29.058873 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.058886 kubelet[2821]: E1216 12:19:29.058887 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.059895 kubelet[2821]: E1216 12:19:29.059833 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.059895 kubelet[2821]: W1216 12:19:29.059893 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.059972 kubelet[2821]: E1216 12:19:29.059907 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.060118 kubelet[2821]: E1216 12:19:29.060100 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.060118 kubelet[2821]: W1216 12:19:29.060115 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.060182 kubelet[2821]: E1216 12:19:29.060126 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.060351 kubelet[2821]: E1216 12:19:29.060335 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.060351 kubelet[2821]: W1216 12:19:29.060347 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.060414 kubelet[2821]: E1216 12:19:29.060358 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.060564 kubelet[2821]: E1216 12:19:29.060542 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.060564 kubelet[2821]: W1216 12:19:29.060559 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.060924 kubelet[2821]: E1216 12:19:29.060569 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.061156 kubelet[2821]: E1216 12:19:29.061126 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.061156 kubelet[2821]: W1216 12:19:29.061147 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.061156 kubelet[2821]: E1216 12:19:29.061158 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.061453 kubelet[2821]: E1216 12:19:29.061429 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.061453 kubelet[2821]: W1216 12:19:29.061446 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.061655 kubelet[2821]: E1216 12:19:29.061457 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.061825 kubelet[2821]: E1216 12:19:29.061800 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.061825 kubelet[2821]: W1216 12:19:29.061817 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.061906 kubelet[2821]: E1216 12:19:29.061829 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.062892 kubelet[2821]: E1216 12:19:29.062857 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.062892 kubelet[2821]: W1216 12:19:29.062880 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.062892 kubelet[2821]: E1216 12:19:29.062893 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.063216 kubelet[2821]: E1216 12:19:29.063072 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.063216 kubelet[2821]: W1216 12:19:29.063086 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.063216 kubelet[2821]: E1216 12:19:29.063094 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.063309 kubelet[2821]: E1216 12:19:29.063234 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.063309 kubelet[2821]: W1216 12:19:29.063241 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.063309 kubelet[2821]: E1216 12:19:29.063249 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.077241 kubelet[2821]: E1216 12:19:29.077183 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.077241 kubelet[2821]: W1216 12:19:29.077212 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.077241 kubelet[2821]: E1216 12:19:29.077232 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.077713 kubelet[2821]: I1216 12:19:29.077261 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/30279a80-ac32-4c4e-affe-8e2742945896-registration-dir\") pod \"csi-node-driver-6nr2l\" (UID: \"30279a80-ac32-4c4e-affe-8e2742945896\") " pod="calico-system/csi-node-driver-6nr2l" Dec 16 12:19:29.077713 kubelet[2821]: E1216 12:19:29.077473 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.077713 kubelet[2821]: W1216 12:19:29.077493 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.077713 kubelet[2821]: E1216 12:19:29.077503 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.078090 kubelet[2821]: I1216 12:19:29.077813 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/30279a80-ac32-4c4e-affe-8e2742945896-socket-dir\") pod \"csi-node-driver-6nr2l\" (UID: \"30279a80-ac32-4c4e-affe-8e2742945896\") " pod="calico-system/csi-node-driver-6nr2l" Dec 16 12:19:29.078279 kubelet[2821]: E1216 12:19:29.078217 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.078279 kubelet[2821]: W1216 12:19:29.078238 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.078279 kubelet[2821]: E1216 12:19:29.078251 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.078582 kubelet[2821]: E1216 12:19:29.078466 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.078582 kubelet[2821]: W1216 12:19:29.078486 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.078582 kubelet[2821]: E1216 12:19:29.078496 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.078716 kubelet[2821]: E1216 12:19:29.078684 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.078716 kubelet[2821]: W1216 12:19:29.078693 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.078716 kubelet[2821]: E1216 12:19:29.078702 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.078793 kubelet[2821]: I1216 12:19:29.078729 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67gck\" (UniqueName: \"kubernetes.io/projected/30279a80-ac32-4c4e-affe-8e2742945896-kube-api-access-67gck\") pod \"csi-node-driver-6nr2l\" (UID: \"30279a80-ac32-4c4e-affe-8e2742945896\") " pod="calico-system/csi-node-driver-6nr2l" Dec 16 12:19:29.079106 kubelet[2821]: E1216 12:19:29.079065 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.079106 kubelet[2821]: W1216 12:19:29.079090 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.079106 kubelet[2821]: E1216 12:19:29.079104 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.079209 kubelet[2821]: I1216 12:19:29.079131 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/30279a80-ac32-4c4e-affe-8e2742945896-varrun\") pod \"csi-node-driver-6nr2l\" (UID: \"30279a80-ac32-4c4e-affe-8e2742945896\") " pod="calico-system/csi-node-driver-6nr2l" Dec 16 12:19:29.079398 kubelet[2821]: E1216 12:19:29.079380 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.079398 kubelet[2821]: W1216 12:19:29.079396 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.079398 kubelet[2821]: E1216 12:19:29.079407 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.080369 kubelet[2821]: E1216 12:19:29.080330 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.080369 kubelet[2821]: W1216 12:19:29.080356 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.080369 kubelet[2821]: E1216 12:19:29.080370 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.080691 kubelet[2821]: E1216 12:19:29.080648 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.080691 kubelet[2821]: W1216 12:19:29.080663 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.080691 kubelet[2821]: E1216 12:19:29.080673 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.081139 kubelet[2821]: E1216 12:19:29.081108 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.081139 kubelet[2821]: W1216 12:19:29.081131 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.081224 kubelet[2821]: E1216 12:19:29.081147 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.081429 kubelet[2821]: E1216 12:19:29.081409 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.081429 kubelet[2821]: W1216 12:19:29.081426 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.081497 kubelet[2821]: E1216 12:19:29.081437 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.081556 kubelet[2821]: I1216 12:19:29.081532 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/30279a80-ac32-4c4e-affe-8e2742945896-kubelet-dir\") pod \"csi-node-driver-6nr2l\" (UID: \"30279a80-ac32-4c4e-affe-8e2742945896\") " pod="calico-system/csi-node-driver-6nr2l" Dec 16 12:19:29.083067 kubelet[2821]: E1216 12:19:29.083021 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.083663 kubelet[2821]: W1216 12:19:29.083166 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.083663 kubelet[2821]: E1216 12:19:29.083188 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.083993 kubelet[2821]: E1216 12:19:29.083976 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.084158 kubelet[2821]: W1216 12:19:29.084081 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.084228 kubelet[2821]: E1216 12:19:29.084215 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.085296 kubelet[2821]: E1216 12:19:29.085271 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.085476 kubelet[2821]: W1216 12:19:29.085392 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.085476 kubelet[2821]: E1216 12:19:29.085416 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.085853 kubelet[2821]: E1216 12:19:29.085774 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.085853 kubelet[2821]: W1216 12:19:29.085795 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.085853 kubelet[2821]: E1216 12:19:29.085809 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.168512 containerd[1576]: time="2025-12-16T12:19:29.168453110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-s82gj,Uid:1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7,Namespace:calico-system,Attempt:0,}" Dec 16 12:19:29.182348 kubelet[2821]: E1216 12:19:29.182262 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.182348 kubelet[2821]: W1216 12:19:29.182341 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.182499 kubelet[2821]: E1216 12:19:29.182377 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.185410 kubelet[2821]: E1216 12:19:29.184746 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.185410 kubelet[2821]: W1216 12:19:29.184808 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.185410 kubelet[2821]: E1216 12:19:29.184915 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.185684 kubelet[2821]: E1216 12:19:29.185641 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.185684 kubelet[2821]: W1216 12:19:29.185663 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.185684 kubelet[2821]: E1216 12:19:29.185677 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.185936 kubelet[2821]: E1216 12:19:29.185915 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.185936 kubelet[2821]: W1216 12:19:29.185927 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.185936 kubelet[2821]: E1216 12:19:29.185936 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.186756 kubelet[2821]: E1216 12:19:29.186104 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.186756 kubelet[2821]: W1216 12:19:29.186112 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.186756 kubelet[2821]: E1216 12:19:29.186121 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.188747 kubelet[2821]: E1216 12:19:29.187899 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.188747 kubelet[2821]: W1216 12:19:29.187924 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.188747 kubelet[2821]: E1216 12:19:29.187940 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.188747 kubelet[2821]: E1216 12:19:29.188098 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.188747 kubelet[2821]: W1216 12:19:29.188105 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.188747 kubelet[2821]: E1216 12:19:29.188114 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.189380 kubelet[2821]: E1216 12:19:29.189277 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.189380 kubelet[2821]: W1216 12:19:29.189295 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.189380 kubelet[2821]: E1216 12:19:29.189309 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.190617 kubelet[2821]: E1216 12:19:29.190508 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.190617 kubelet[2821]: W1216 12:19:29.190527 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.190617 kubelet[2821]: E1216 12:19:29.190541 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.190937 kubelet[2821]: E1216 12:19:29.190773 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.190937 kubelet[2821]: W1216 12:19:29.190789 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.190937 kubelet[2821]: E1216 12:19:29.190799 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.191639 kubelet[2821]: E1216 12:19:29.190953 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.191639 kubelet[2821]: W1216 12:19:29.190961 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.191639 kubelet[2821]: E1216 12:19:29.190970 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.191639 kubelet[2821]: E1216 12:19:29.191098 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.191639 kubelet[2821]: W1216 12:19:29.191106 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.191639 kubelet[2821]: E1216 12:19:29.191116 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.191639 kubelet[2821]: E1216 12:19:29.191383 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.191639 kubelet[2821]: W1216 12:19:29.191401 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.191639 kubelet[2821]: E1216 12:19:29.191415 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.192189 kubelet[2821]: E1216 12:19:29.192058 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.192189 kubelet[2821]: W1216 12:19:29.192073 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.192189 kubelet[2821]: E1216 12:19:29.192086 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.194035 kubelet[2821]: E1216 12:19:29.193759 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.194035 kubelet[2821]: W1216 12:19:29.193783 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.194035 kubelet[2821]: E1216 12:19:29.193804 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.194363 kubelet[2821]: E1216 12:19:29.194211 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.194363 kubelet[2821]: W1216 12:19:29.194228 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.194363 kubelet[2821]: E1216 12:19:29.194240 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.194636 kubelet[2821]: E1216 12:19:29.194531 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.194636 kubelet[2821]: W1216 12:19:29.194559 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.194636 kubelet[2821]: E1216 12:19:29.194571 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.195168 kubelet[2821]: E1216 12:19:29.195014 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.195168 kubelet[2821]: W1216 12:19:29.195030 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.195168 kubelet[2821]: E1216 12:19:29.195044 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.195427 kubelet[2821]: E1216 12:19:29.195315 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.195427 kubelet[2821]: W1216 12:19:29.195327 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.195427 kubelet[2821]: E1216 12:19:29.195337 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.195588 kubelet[2821]: E1216 12:19:29.195574 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.195667 kubelet[2821]: W1216 12:19:29.195654 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.196055 kubelet[2821]: E1216 12:19:29.195924 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.196267 kubelet[2821]: E1216 12:19:29.196254 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.196343 kubelet[2821]: W1216 12:19:29.196329 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.196411 kubelet[2821]: E1216 12:19:29.196400 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.196775 kubelet[2821]: E1216 12:19:29.196760 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.197104 kubelet[2821]: W1216 12:19:29.196923 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.197104 kubelet[2821]: E1216 12:19:29.196957 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.197275 kubelet[2821]: E1216 12:19:29.197257 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.197452 kubelet[2821]: W1216 12:19:29.197330 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.197452 kubelet[2821]: E1216 12:19:29.197347 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.197660 kubelet[2821]: E1216 12:19:29.197640 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.197725 kubelet[2821]: W1216 12:19:29.197709 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.197925 kubelet[2821]: E1216 12:19:29.197782 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:29.198076 kubelet[2821]: E1216 12:19:29.198064 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.198156 kubelet[2821]: W1216 12:19:29.198136 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.198272 kubelet[2821]: E1216 12:19:29.198258 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.201364 containerd[1576]: time="2025-12-16T12:19:29.200934566Z" level=info msg="connecting to shim 5f12ec92a87937ac8749df3efb24f04fb11524d8878e31738bdec2af2ab299ef" address="unix:///run/containerd/s/e8d77f0605ce36c6a53a658a13c530d642220b2839f6c975169663d7a1618770" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:29.221157 kubelet[2821]: E1216 12:19:29.221060 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.221157 kubelet[2821]: W1216 12:19:29.221091 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.221157 kubelet[2821]: E1216 12:19:29.221112 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.241130 systemd[1]: Started cri-containerd-5f12ec92a87937ac8749df3efb24f04fb11524d8878e31738bdec2af2ab299ef.scope - libcontainer container 5f12ec92a87937ac8749df3efb24f04fb11524d8878e31738bdec2af2ab299ef. 
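
The kubelet triplet that repeats throughout this section (driver-call.go:262, then the W1216 driver-call.go:149 line, then plugins.go:697) is one failure recurring on every plugin-probe pass: kubelet scans /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, finds a nodeagent~uds directory, tries to run the uds binary with the FlexVolume "init" call, the executable is missing, the output is therefore empty, and unmarshalling "" as JSON fails. The sketch below is only an illustration of the documented FlexVolume calling convention, showing the minimal "init" response such a driver is expected to print; the binary path comes from the log, everything else (file name, stub behaviour) is an assumption, not the real node-agent driver. On nodes that never use the node agent, removing the stale nodeagent~uds directory is usually enough to silence this probe noise.

// flexvol_stub.go: minimal sketch of a FlexVolume driver entry point that
// answers kubelet's "init" probe with the JSON it expects, so the
// "unexpected end of JSON input" unmarshal error above would not recur.
// Illustrative only; not the actual nodeagent~uds/uds binary.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the fields kubelet's FlexVolume driver-call parses.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) < 2 {
		fmt.Println(`{"status":"Failure","message":"no command given"}`)
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// "attach": false tells kubelet this driver has no attach/detach phase.
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
	default:
		// Everything else is unimplemented in this stub.
		fmt.Println(`{"status":"Not supported"}`)
	}
}
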
Dec 16 12:19:29.262000 audit: BPF prog-id=149 op=LOAD Dec 16 12:19:29.263000 audit: BPF prog-id=150 op=LOAD Dec 16 12:19:29.263000 audit[3336]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3323 pid=3336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:29.263000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566313265633932613837393337616338373439646633656662323466 Dec 16 12:19:29.263000 audit: BPF prog-id=150 op=UNLOAD Dec 16 12:19:29.263000 audit[3336]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3323 pid=3336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:29.263000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566313265633932613837393337616338373439646633656662323466 Dec 16 12:19:29.263000 audit: BPF prog-id=151 op=LOAD Dec 16 12:19:29.263000 audit[3336]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3323 pid=3336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:29.263000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566313265633932613837393337616338373439646633656662323466 Dec 16 12:19:29.264000 audit: BPF prog-id=152 op=LOAD Dec 16 12:19:29.264000 audit[3336]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3323 pid=3336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:29.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566313265633932613837393337616338373439646633656662323466 Dec 16 12:19:29.264000 audit: BPF prog-id=152 op=UNLOAD Dec 16 12:19:29.264000 audit[3336]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3323 pid=3336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:29.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566313265633932613837393337616338373439646633656662323466 Dec 16 12:19:29.264000 audit: BPF prog-id=151 op=UNLOAD Dec 16 12:19:29.264000 audit[3336]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3323 pid=3336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:29.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566313265633932613837393337616338373439646633656662323466 Dec 16 12:19:29.264000 audit: BPF prog-id=153 op=LOAD Dec 16 12:19:29.264000 audit[3336]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3323 pid=3336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:29.264000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566313265633932613837393337616338373439646633656662323466 Dec 16 12:19:29.283776 containerd[1576]: time="2025-12-16T12:19:29.283734011Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-s82gj,Uid:1f516d62-f3d8-4caa-bebd-d5f9a0d03ea7,Namespace:calico-system,Attempt:0,} returns sandbox id \"5f12ec92a87937ac8749df3efb24f04fb11524d8878e31738bdec2af2ab299ef\"" Dec 16 12:19:29.286915 containerd[1576]: time="2025-12-16T12:19:29.286861499Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 12:19:29.720000 audit[3365]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=3365 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:29.720000 audit[3365]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffd294eaf0 a2=0 a3=1 items=0 ppid=2972 pid=3365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:29.720000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:29.725000 audit[3365]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3365 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:29.725000 audit[3365]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd294eaf0 a2=0 a3=1 items=0 ppid=2972 pid=3365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:29.725000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:29.875930 kubelet[2821]: E1216 12:19:29.875352 2821 secret.go:189] Couldn't get secret calico-system/typha-certs: failed to sync secret cache: timed out waiting for the condition Dec 16 12:19:29.875930 kubelet[2821]: E1216 12:19:29.875506 2821 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd0c4eb5-974e-4b22-ad44-b8e0ac04125d-typha-certs podName:cd0c4eb5-974e-4b22-ad44-b8e0ac04125d nodeName:}" failed. 
No retries permitted until 2025-12-16 12:19:30.375466224 +0000 UTC m=+28.111036146 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "typha-certs" (UniqueName: "kubernetes.io/secret/cd0c4eb5-974e-4b22-ad44-b8e0ac04125d-typha-certs") pod "calico-typha-7584978976-9slwf" (UID: "cd0c4eb5-974e-4b22-ad44-b8e0ac04125d") : failed to sync secret cache: timed out waiting for the condition Dec 16 12:19:29.897512 kubelet[2821]: E1216 12:19:29.897438 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.897512 kubelet[2821]: W1216 12:19:29.897471 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.897984 kubelet[2821]: E1216 12:19:29.897827 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:29.999369 kubelet[2821]: E1216 12:19:29.999206 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:29.999369 kubelet[2821]: W1216 12:19:29.999243 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:29.999369 kubelet[2821]: E1216 12:19:29.999274 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:30.100673 kubelet[2821]: E1216 12:19:30.100516 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:30.100673 kubelet[2821]: W1216 12:19:30.100555 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:30.100673 kubelet[2821]: E1216 12:19:30.100587 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:30.202132 kubelet[2821]: E1216 12:19:30.201975 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:30.202132 kubelet[2821]: W1216 12:19:30.202015 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:30.202132 kubelet[2821]: E1216 12:19:30.202048 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:30.303053 kubelet[2821]: E1216 12:19:30.302938 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:30.303053 kubelet[2821]: W1216 12:19:30.302976 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:30.303053 kubelet[2821]: E1216 12:19:30.303001 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:30.404727 kubelet[2821]: E1216 12:19:30.404691 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:30.404727 kubelet[2821]: W1216 12:19:30.404718 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:30.405089 kubelet[2821]: E1216 12:19:30.404747 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:30.405089 kubelet[2821]: E1216 12:19:30.405085 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:30.405147 kubelet[2821]: W1216 12:19:30.405097 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:30.405147 kubelet[2821]: E1216 12:19:30.405109 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:30.405297 kubelet[2821]: E1216 12:19:30.405286 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:30.405297 kubelet[2821]: W1216 12:19:30.405297 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:30.405357 kubelet[2821]: E1216 12:19:30.405315 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:30.405465 kubelet[2821]: E1216 12:19:30.405451 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:30.405514 kubelet[2821]: W1216 12:19:30.405470 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:30.405514 kubelet[2821]: E1216 12:19:30.405480 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:19:30.405756 kubelet[2821]: E1216 12:19:30.405744 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:30.405951 kubelet[2821]: W1216 12:19:30.405757 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:30.405951 kubelet[2821]: E1216 12:19:30.405768 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:30.413943 kubelet[2821]: E1216 12:19:30.413748 2821 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:19:30.413943 kubelet[2821]: W1216 12:19:30.413775 2821 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:19:30.413943 kubelet[2821]: E1216 12:19:30.413797 2821 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:19:30.439217 kubelet[2821]: E1216 12:19:30.439160 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:19:30.552458 containerd[1576]: time="2025-12-16T12:19:30.552407600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7584978976-9slwf,Uid:cd0c4eb5-974e-4b22-ad44-b8e0ac04125d,Namespace:calico-system,Attempt:0,}" Dec 16 12:19:30.577152 containerd[1576]: time="2025-12-16T12:19:30.577017104Z" level=info msg="connecting to shim 7bf75ddb3ffd5416c0962e449dfaa1b0dbafb9f3d33003478b0369c1d5ec6641" address="unix:///run/containerd/s/e4bc625c205f68448001e51a3ecb274b7ff5b917d4ca467a8424821f9d89f1f7" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:30.618939 systemd[1]: Started cri-containerd-7bf75ddb3ffd5416c0962e449dfaa1b0dbafb9f3d33003478b0369c1d5ec6641.scope - libcontainer container 7bf75ddb3ffd5416c0962e449dfaa1b0dbafb9f3d33003478b0369c1d5ec6641. 
Dec 16 12:19:30.632000 audit: BPF prog-id=154 op=LOAD Dec 16 12:19:30.636144 kernel: kauditd_printk_skb: 36 callbacks suppressed Dec 16 12:19:30.636197 kernel: audit: type=1334 audit(1765887570.632:534): prog-id=154 op=LOAD Dec 16 12:19:30.634000 audit: BPF prog-id=155 op=LOAD Dec 16 12:19:30.634000 audit[3399]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=3387 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.639316 kernel: audit: type=1334 audit(1765887570.634:535): prog-id=155 op=LOAD Dec 16 12:19:30.639402 kernel: audit: type=1300 audit(1765887570.634:535): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=3387 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.634000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762663735646462336666643534313663303936326534343964666161 Dec 16 12:19:30.642657 kernel: audit: type=1327 audit(1765887570.634:535): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762663735646462336666643534313663303936326534343964666161 Dec 16 12:19:30.642765 kernel: audit: type=1334 audit(1765887570.634:536): prog-id=155 op=UNLOAD Dec 16 12:19:30.634000 audit: BPF prog-id=155 op=UNLOAD Dec 16 12:19:30.634000 audit[3399]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3387 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.645158 kernel: audit: type=1300 audit(1765887570.634:536): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3387 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.634000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762663735646462336666643534313663303936326534343964666161 Dec 16 12:19:30.652934 kernel: audit: type=1327 audit(1765887570.634:536): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762663735646462336666643534313663303936326534343964666161 Dec 16 12:19:30.657801 kernel: audit: type=1334 audit(1765887570.634:537): prog-id=156 op=LOAD Dec 16 12:19:30.657865 kernel: audit: type=1300 audit(1765887570.634:537): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=3387 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.634000 audit: BPF prog-id=156 op=LOAD Dec 16 12:19:30.634000 audit[3399]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=3387 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.634000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762663735646462336666643534313663303936326534343964666161 Dec 16 12:19:30.660815 kernel: audit: type=1327 audit(1765887570.634:537): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762663735646462336666643534313663303936326534343964666161 Dec 16 12:19:30.635000 audit: BPF prog-id=157 op=LOAD Dec 16 12:19:30.635000 audit[3399]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=3387 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762663735646462336666643534313663303936326534343964666161 Dec 16 12:19:30.635000 audit: BPF prog-id=157 op=UNLOAD Dec 16 12:19:30.635000 audit[3399]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3387 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762663735646462336666643534313663303936326534343964666161 Dec 16 12:19:30.635000 audit: BPF prog-id=156 op=UNLOAD Dec 16 12:19:30.635000 audit[3399]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3387 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762663735646462336666643534313663303936326534343964666161 Dec 16 12:19:30.635000 audit: BPF prog-id=158 op=LOAD Dec 16 12:19:30.635000 audit[3399]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=3387 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.635000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762663735646462336666643534313663303936326534343964666161 Dec 16 12:19:30.679182 containerd[1576]: time="2025-12-16T12:19:30.679032424Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7584978976-9slwf,Uid:cd0c4eb5-974e-4b22-ad44-b8e0ac04125d,Namespace:calico-system,Attempt:0,} returns sandbox id \"7bf75ddb3ffd5416c0962e449dfaa1b0dbafb9f3d33003478b0369c1d5ec6641\"" Dec 16 12:19:30.804943 containerd[1576]: time="2025-12-16T12:19:30.804874057Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:30.808875 containerd[1576]: time="2025-12-16T12:19:30.808806374Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:30.810953 containerd[1576]: time="2025-12-16T12:19:30.810836775Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:30.816737 containerd[1576]: time="2025-12-16T12:19:30.815285353Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:30.817823 containerd[1576]: time="2025-12-16T12:19:30.817787533Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.530877512s" Dec 16 12:19:30.817972 containerd[1576]: time="2025-12-16T12:19:30.817953700Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 16 12:19:30.821122 containerd[1576]: time="2025-12-16T12:19:30.821060744Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 12:19:30.826672 containerd[1576]: time="2025-12-16T12:19:30.825838615Z" level=info msg="CreateContainer within sandbox \"5f12ec92a87937ac8749df3efb24f04fb11524d8878e31738bdec2af2ab299ef\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 12:19:30.835855 containerd[1576]: time="2025-12-16T12:19:30.835747372Z" level=info msg="Container d7378b801d527ec3bc65ab077a0be55ed0139c744c13eb15699df37c1a32ab8e: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:19:30.853082 containerd[1576]: time="2025-12-16T12:19:30.853003662Z" level=info msg="CreateContainer within sandbox \"5f12ec92a87937ac8749df3efb24f04fb11524d8878e31738bdec2af2ab299ef\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d7378b801d527ec3bc65ab077a0be55ed0139c744c13eb15699df37c1a32ab8e\"" Dec 16 12:19:30.855026 containerd[1576]: time="2025-12-16T12:19:30.854982381Z" level=info msg="StartContainer for \"d7378b801d527ec3bc65ab077a0be55ed0139c744c13eb15699df37c1a32ab8e\"" Dec 16 12:19:30.857450 containerd[1576]: time="2025-12-16T12:19:30.857401878Z" 
level=info msg="connecting to shim d7378b801d527ec3bc65ab077a0be55ed0139c744c13eb15699df37c1a32ab8e" address="unix:///run/containerd/s/e8d77f0605ce36c6a53a658a13c530d642220b2839f6c975169663d7a1618770" protocol=ttrpc version=3 Dec 16 12:19:30.881939 systemd[1]: Started cri-containerd-d7378b801d527ec3bc65ab077a0be55ed0139c744c13eb15699df37c1a32ab8e.scope - libcontainer container d7378b801d527ec3bc65ab077a0be55ed0139c744c13eb15699df37c1a32ab8e. Dec 16 12:19:30.893133 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2398709091.mount: Deactivated successfully. Dec 16 12:19:30.929000 audit: BPF prog-id=159 op=LOAD Dec 16 12:19:30.929000 audit[3433]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40002283e8 a2=98 a3=0 items=0 ppid=3323 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.929000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333738623830316435323765633362633635616230373761306265 Dec 16 12:19:30.930000 audit: BPF prog-id=160 op=LOAD Dec 16 12:19:30.930000 audit[3433]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000228168 a2=98 a3=0 items=0 ppid=3323 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333738623830316435323765633362633635616230373761306265 Dec 16 12:19:30.930000 audit: BPF prog-id=160 op=UNLOAD Dec 16 12:19:30.930000 audit[3433]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3323 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333738623830316435323765633362633635616230373761306265 Dec 16 12:19:30.930000 audit: BPF prog-id=159 op=UNLOAD Dec 16 12:19:30.930000 audit[3433]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3323 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333738623830316435323765633362633635616230373761306265 Dec 16 12:19:30.930000 audit: BPF prog-id=161 op=LOAD Dec 16 12:19:30.930000 audit[3433]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000228648 a2=98 a3=0 items=0 ppid=3323 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:30.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437333738623830316435323765633362633635616230373761306265 Dec 16 12:19:30.962124 containerd[1576]: time="2025-12-16T12:19:30.962055103Z" level=info msg="StartContainer for \"d7378b801d527ec3bc65ab077a0be55ed0139c744c13eb15699df37c1a32ab8e\" returns successfully" Dec 16 12:19:30.980617 systemd[1]: cri-containerd-d7378b801d527ec3bc65ab077a0be55ed0139c744c13eb15699df37c1a32ab8e.scope: Deactivated successfully. Dec 16 12:19:30.982000 audit: BPF prog-id=161 op=UNLOAD Dec 16 12:19:30.986377 containerd[1576]: time="2025-12-16T12:19:30.986313113Z" level=info msg="received container exit event container_id:\"d7378b801d527ec3bc65ab077a0be55ed0139c744c13eb15699df37c1a32ab8e\" id:\"d7378b801d527ec3bc65ab077a0be55ed0139c744c13eb15699df37c1a32ab8e\" pid:3446 exited_at:{seconds:1765887570 nanos:985820573}" Dec 16 12:19:31.016498 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d7378b801d527ec3bc65ab077a0be55ed0139c744c13eb15699df37c1a32ab8e-rootfs.mount: Deactivated successfully. Dec 16 12:19:32.441612 kubelet[2821]: E1216 12:19:32.441565 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:19:33.252841 containerd[1576]: time="2025-12-16T12:19:33.252796260Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:33.254817 containerd[1576]: time="2025-12-16T12:19:33.254706251Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Dec 16 12:19:33.255659 containerd[1576]: time="2025-12-16T12:19:33.255547082Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:33.258049 containerd[1576]: time="2025-12-16T12:19:33.257983292Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:33.259223 containerd[1576]: time="2025-12-16T12:19:33.258840164Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.437723017s" Dec 16 12:19:33.259223 containerd[1576]: time="2025-12-16T12:19:33.258887765Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 16 12:19:33.261239 containerd[1576]: time="2025-12-16T12:19:33.260895920Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 12:19:33.280138 
containerd[1576]: time="2025-12-16T12:19:33.280073549Z" level=info msg="CreateContainer within sandbox \"7bf75ddb3ffd5416c0962e449dfaa1b0dbafb9f3d33003478b0369c1d5ec6641\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 12:19:33.291455 containerd[1576]: time="2025-12-16T12:19:33.291410049Z" level=info msg="Container 82e7f1c2d7660e486e72ff672b587ca39a62221a3e1716f59629a75597792dde: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:19:33.301589 containerd[1576]: time="2025-12-16T12:19:33.301517183Z" level=info msg="CreateContainer within sandbox \"7bf75ddb3ffd5416c0962e449dfaa1b0dbafb9f3d33003478b0369c1d5ec6641\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"82e7f1c2d7660e486e72ff672b587ca39a62221a3e1716f59629a75597792dde\"" Dec 16 12:19:33.303652 containerd[1576]: time="2025-12-16T12:19:33.303057360Z" level=info msg="StartContainer for \"82e7f1c2d7660e486e72ff672b587ca39a62221a3e1716f59629a75597792dde\"" Dec 16 12:19:33.305559 containerd[1576]: time="2025-12-16T12:19:33.305452969Z" level=info msg="connecting to shim 82e7f1c2d7660e486e72ff672b587ca39a62221a3e1716f59629a75597792dde" address="unix:///run/containerd/s/e4bc625c205f68448001e51a3ecb274b7ff5b917d4ca467a8424821f9d89f1f7" protocol=ttrpc version=3 Dec 16 12:19:33.331048 systemd[1]: Started cri-containerd-82e7f1c2d7660e486e72ff672b587ca39a62221a3e1716f59629a75597792dde.scope - libcontainer container 82e7f1c2d7660e486e72ff672b587ca39a62221a3e1716f59629a75597792dde. Dec 16 12:19:33.346000 audit: BPF prog-id=162 op=LOAD Dec 16 12:19:33.348000 audit: BPF prog-id=163 op=LOAD Dec 16 12:19:33.348000 audit[3492]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=3387 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:33.348000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832653766316332643736363065343836653732666636373262353837 Dec 16 12:19:33.348000 audit: BPF prog-id=163 op=UNLOAD Dec 16 12:19:33.348000 audit[3492]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3387 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:33.348000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832653766316332643736363065343836653732666636373262353837 Dec 16 12:19:33.349000 audit: BPF prog-id=164 op=LOAD Dec 16 12:19:33.349000 audit[3492]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=3387 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:33.349000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832653766316332643736363065343836653732666636373262353837 Dec 16 12:19:33.349000 audit: BPF prog-id=165 op=LOAD Dec 16 12:19:33.349000 audit[3492]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=3387 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:33.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832653766316332643736363065343836653732666636373262353837 Dec 16 12:19:33.349000 audit: BPF prog-id=165 op=UNLOAD Dec 16 12:19:33.349000 audit[3492]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3387 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:33.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832653766316332643736363065343836653732666636373262353837 Dec 16 12:19:33.349000 audit: BPF prog-id=164 op=UNLOAD Dec 16 12:19:33.349000 audit[3492]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3387 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:33.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832653766316332643736363065343836653732666636373262353837 Dec 16 12:19:33.350000 audit: BPF prog-id=166 op=LOAD Dec 16 12:19:33.350000 audit[3492]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=3387 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:33.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832653766316332643736363065343836653732666636373262353837 Dec 16 12:19:33.387806 containerd[1576]: time="2025-12-16T12:19:33.387764695Z" level=info msg="StartContainer for \"82e7f1c2d7660e486e72ff672b587ca39a62221a3e1716f59629a75597792dde\" returns successfully" Dec 16 12:19:33.676660 kubelet[2821]: I1216 12:19:33.676480 2821 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7584978976-9slwf" podStartSLOduration=3.09702718 podStartE2EDuration="5.676464181s" podCreationTimestamp="2025-12-16 12:19:28 +0000 UTC" firstStartedPulling="2025-12-16 12:19:30.680789574 +0000 UTC 
m=+28.416359456" lastFinishedPulling="2025-12-16 12:19:33.260226535 +0000 UTC m=+30.995796457" observedRunningTime="2025-12-16 12:19:33.67564115 +0000 UTC m=+31.411211032" watchObservedRunningTime="2025-12-16 12:19:33.676464181 +0000 UTC m=+31.412034063" Dec 16 12:19:34.441301 kubelet[2821]: E1216 12:19:34.440210 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:19:34.605220 kubelet[2821]: I1216 12:19:34.605181 2821 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:19:35.959864 containerd[1576]: time="2025-12-16T12:19:35.959758400Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:35.961227 containerd[1576]: time="2025-12-16T12:19:35.961062766Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Dec 16 12:19:35.963404 containerd[1576]: time="2025-12-16T12:19:35.962203686Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:35.964581 containerd[1576]: time="2025-12-16T12:19:35.964529248Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:35.965391 containerd[1576]: time="2025-12-16T12:19:35.965352477Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.703849735s" Dec 16 12:19:35.965391 containerd[1576]: time="2025-12-16T12:19:35.965392759Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 16 12:19:35.971525 containerd[1576]: time="2025-12-16T12:19:35.971484414Z" level=info msg="CreateContainer within sandbox \"5f12ec92a87937ac8749df3efb24f04fb11524d8878e31738bdec2af2ab299ef\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 12:19:35.988649 containerd[1576]: time="2025-12-16T12:19:35.987668545Z" level=info msg="Container 8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:19:35.992778 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2303121786.mount: Deactivated successfully. 
Dec 16 12:19:36.004238 containerd[1576]: time="2025-12-16T12:19:36.004180127Z" level=info msg="CreateContainer within sandbox \"5f12ec92a87937ac8749df3efb24f04fb11524d8878e31738bdec2af2ab299ef\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b\"" Dec 16 12:19:36.005477 containerd[1576]: time="2025-12-16T12:19:36.005175921Z" level=info msg="StartContainer for \"8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b\"" Dec 16 12:19:36.007280 containerd[1576]: time="2025-12-16T12:19:36.007244833Z" level=info msg="connecting to shim 8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b" address="unix:///run/containerd/s/e8d77f0605ce36c6a53a658a13c530d642220b2839f6c975169663d7a1618770" protocol=ttrpc version=3 Dec 16 12:19:36.042947 systemd[1]: Started cri-containerd-8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b.scope - libcontainer container 8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b. Dec 16 12:19:36.096000 audit: BPF prog-id=167 op=LOAD Dec 16 12:19:36.099104 kernel: kauditd_printk_skb: 50 callbacks suppressed Dec 16 12:19:36.099216 kernel: audit: type=1334 audit(1765887576.096:556): prog-id=167 op=LOAD Dec 16 12:19:36.096000 audit[3535]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=3323 pid=3535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:36.101658 kernel: audit: type=1300 audit(1765887576.096:556): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=3323 pid=3535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:36.101768 kernel: audit: type=1327 audit(1765887576.096:556): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835393165373831366237623534316331646639353633633337316434 Dec 16 12:19:36.096000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835393165373831366237623534316331646639353633633337316434 Dec 16 12:19:36.097000 audit: BPF prog-id=168 op=LOAD Dec 16 12:19:36.097000 audit[3535]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=3323 pid=3535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:36.106368 kernel: audit: type=1334 audit(1765887576.097:557): prog-id=168 op=LOAD Dec 16 12:19:36.106526 kernel: audit: type=1300 audit(1765887576.097:557): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=3323 pid=3535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:36.097000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835393165373831366237623534316331646639353633633337316434 Dec 16 12:19:36.108947 kernel: audit: type=1327 audit(1765887576.097:557): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835393165373831366237623534316331646639353633633337316434 Dec 16 12:19:36.109257 kernel: audit: type=1334 audit(1765887576.097:558): prog-id=168 op=UNLOAD Dec 16 12:19:36.097000 audit: BPF prog-id=168 op=UNLOAD Dec 16 12:19:36.109852 kernel: audit: type=1300 audit(1765887576.097:558): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3323 pid=3535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:36.097000 audit[3535]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3323 pid=3535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:36.097000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835393165373831366237623534316331646639353633633337316434 Dec 16 12:19:36.114461 kernel: audit: type=1327 audit(1765887576.097:558): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835393165373831366237623534316331646639353633633337316434 Dec 16 12:19:36.097000 audit: BPF prog-id=167 op=UNLOAD Dec 16 12:19:36.115749 kernel: audit: type=1334 audit(1765887576.097:559): prog-id=167 op=UNLOAD Dec 16 12:19:36.097000 audit[3535]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3323 pid=3535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:36.097000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835393165373831366237623534316331646639353633633337316434 Dec 16 12:19:36.097000 audit: BPF prog-id=169 op=LOAD Dec 16 12:19:36.097000 audit[3535]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=3323 pid=3535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:36.097000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835393165373831366237623534316331646639353633633337316434 Dec 16 12:19:36.141663 
containerd[1576]: time="2025-12-16T12:19:36.141536233Z" level=info msg="StartContainer for \"8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b\" returns successfully" Dec 16 12:19:36.440850 kubelet[2821]: E1216 12:19:36.439890 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:19:36.674414 containerd[1576]: time="2025-12-16T12:19:36.674002033Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 12:19:36.676776 systemd[1]: cri-containerd-8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b.scope: Deactivated successfully. Dec 16 12:19:36.678790 systemd[1]: cri-containerd-8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b.scope: Consumed 508ms CPU time, 192.8M memory peak, 165.9M written to disk. Dec 16 12:19:36.680000 audit: BPF prog-id=169 op=UNLOAD Dec 16 12:19:36.682803 containerd[1576]: time="2025-12-16T12:19:36.681507372Z" level=info msg="received container exit event container_id:\"8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b\" id:\"8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b\" pid:3548 exited_at:{seconds:1765887576 nanos:681199082}" Dec 16 12:19:36.699769 kubelet[2821]: I1216 12:19:36.698753 2821 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Dec 16 12:19:36.721491 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8591e7816b7b541c1df9563c371d4d6327af326a6db26c3e9a442f2db5a4180b-rootfs.mount: Deactivated successfully. Dec 16 12:19:36.803345 systemd[1]: Created slice kubepods-burstable-pod53dd6675_4fbd_4c22_90a9_ace54315889a.slice - libcontainer container kubepods-burstable-pod53dd6675_4fbd_4c22_90a9_ace54315889a.slice. Dec 16 12:19:36.819530 systemd[1]: Created slice kubepods-burstable-pod2c875d65_0c1c_4f65_a11e_e47b73dda454.slice - libcontainer container kubepods-burstable-pod2c875d65_0c1c_4f65_a11e_e47b73dda454.slice. Dec 16 12:19:36.831601 systemd[1]: Created slice kubepods-besteffort-podb4c4df94_c3ee_426c_b06d_ed9edc99469b.slice - libcontainer container kubepods-besteffort-podb4c4df94_c3ee_426c_b06d_ed9edc99469b.slice. Dec 16 12:19:36.843224 systemd[1]: Created slice kubepods-besteffort-pod6ad46257_20db_42d2_b357_93e753f0c2ca.slice - libcontainer container kubepods-besteffort-pod6ad46257_20db_42d2_b357_93e753f0c2ca.slice. Dec 16 12:19:36.856264 systemd[1]: Created slice kubepods-besteffort-podbbfa367f_11d3_466c_9181_c6ee23836f5f.slice - libcontainer container kubepods-besteffort-podbbfa367f_11d3_466c_9181_c6ee23836f5f.slice. 
Dec 16 12:19:36.857666 kubelet[2821]: I1216 12:19:36.857613 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad46257-20db-42d2-b357-93e753f0c2ca-config\") pod \"goldmane-7c778bb748-85msp\" (UID: \"6ad46257-20db-42d2-b357-93e753f0c2ca\") " pod="calico-system/goldmane-7c778bb748-85msp" Dec 16 12:19:36.857884 kubelet[2821]: I1216 12:19:36.857865 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ad46257-20db-42d2-b357-93e753f0c2ca-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-85msp\" (UID: \"6ad46257-20db-42d2-b357-93e753f0c2ca\") " pod="calico-system/goldmane-7c778bb748-85msp" Dec 16 12:19:36.858152 kubelet[2821]: I1216 12:19:36.858131 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn8bx\" (UniqueName: \"kubernetes.io/projected/b4c4df94-c3ee-426c-b06d-ed9edc99469b-kube-api-access-gn8bx\") pod \"calico-apiserver-85ddfd7b99-s9l56\" (UID: \"b4c4df94-c3ee-426c-b06d-ed9edc99469b\") " pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" Dec 16 12:19:36.858356 kubelet[2821]: I1216 12:19:36.858312 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/78447202-a453-4865-9c1c-3b28d64baee4-whisker-backend-key-pair\") pod \"whisker-849969496b-sljp9\" (UID: \"78447202-a453-4865-9c1c-3b28d64baee4\") " pod="calico-system/whisker-849969496b-sljp9" Dec 16 12:19:36.858876 kubelet[2821]: I1216 12:19:36.858854 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2nrr\" (UniqueName: \"kubernetes.io/projected/78447202-a453-4865-9c1c-3b28d64baee4-kube-api-access-h2nrr\") pod \"whisker-849969496b-sljp9\" (UID: \"78447202-a453-4865-9c1c-3b28d64baee4\") " pod="calico-system/whisker-849969496b-sljp9" Dec 16 12:19:36.859129 kubelet[2821]: I1216 12:19:36.859111 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmnmq\" (UniqueName: \"kubernetes.io/projected/53dd6675-4fbd-4c22-90a9-ace54315889a-kube-api-access-dmnmq\") pod \"coredns-66bc5c9577-zf8pk\" (UID: \"53dd6675-4fbd-4c22-90a9-ace54315889a\") " pod="kube-system/coredns-66bc5c9577-zf8pk" Dec 16 12:19:36.859258 kubelet[2821]: I1216 12:19:36.859245 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbfa367f-11d3-466c-9181-c6ee23836f5f-tigera-ca-bundle\") pod \"calico-kube-controllers-5d865fff8d-bxz6x\" (UID: \"bbfa367f-11d3-466c-9181-c6ee23836f5f\") " pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" Dec 16 12:19:36.859367 kubelet[2821]: I1216 12:19:36.859356 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/6ad46257-20db-42d2-b357-93e753f0c2ca-goldmane-key-pair\") pod \"goldmane-7c778bb748-85msp\" (UID: \"6ad46257-20db-42d2-b357-93e753f0c2ca\") " pod="calico-system/goldmane-7c778bb748-85msp" Dec 16 12:19:36.859544 kubelet[2821]: I1216 12:19:36.859484 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/b4c4df94-c3ee-426c-b06d-ed9edc99469b-calico-apiserver-certs\") pod \"calico-apiserver-85ddfd7b99-s9l56\" (UID: \"b4c4df94-c3ee-426c-b06d-ed9edc99469b\") " pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" Dec 16 12:19:36.860039 kubelet[2821]: I1216 12:19:36.860001 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c875d65-0c1c-4f65-a11e-e47b73dda454-config-volume\") pod \"coredns-66bc5c9577-v6jwv\" (UID: \"2c875d65-0c1c-4f65-a11e-e47b73dda454\") " pod="kube-system/coredns-66bc5c9577-v6jwv" Dec 16 12:19:36.860449 kubelet[2821]: I1216 12:19:36.860413 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x2s7\" (UniqueName: \"kubernetes.io/projected/2c875d65-0c1c-4f65-a11e-e47b73dda454-kube-api-access-6x2s7\") pod \"coredns-66bc5c9577-v6jwv\" (UID: \"2c875d65-0c1c-4f65-a11e-e47b73dda454\") " pod="kube-system/coredns-66bc5c9577-v6jwv" Dec 16 12:19:36.861299 kubelet[2821]: I1216 12:19:36.861075 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqzwk\" (UniqueName: \"kubernetes.io/projected/4dd8654f-30cc-4aed-bf7b-e1600d664a65-kube-api-access-jqzwk\") pod \"calico-apiserver-85ddfd7b99-lzr5b\" (UID: \"4dd8654f-30cc-4aed-bf7b-e1600d664a65\") " pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" Dec 16 12:19:36.861299 kubelet[2821]: I1216 12:19:36.861116 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78447202-a453-4865-9c1c-3b28d64baee4-whisker-ca-bundle\") pod \"whisker-849969496b-sljp9\" (UID: \"78447202-a453-4865-9c1c-3b28d64baee4\") " pod="calico-system/whisker-849969496b-sljp9" Dec 16 12:19:36.861299 kubelet[2821]: I1216 12:19:36.861135 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53dd6675-4fbd-4c22-90a9-ace54315889a-config-volume\") pod \"coredns-66bc5c9577-zf8pk\" (UID: \"53dd6675-4fbd-4c22-90a9-ace54315889a\") " pod="kube-system/coredns-66bc5c9577-zf8pk" Dec 16 12:19:36.861299 kubelet[2821]: I1216 12:19:36.861156 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4dd8654f-30cc-4aed-bf7b-e1600d664a65-calico-apiserver-certs\") pod \"calico-apiserver-85ddfd7b99-lzr5b\" (UID: \"4dd8654f-30cc-4aed-bf7b-e1600d664a65\") " pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" Dec 16 12:19:36.861299 kubelet[2821]: I1216 12:19:36.861171 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjsnk\" (UniqueName: \"kubernetes.io/projected/6ad46257-20db-42d2-b357-93e753f0c2ca-kube-api-access-gjsnk\") pod \"goldmane-7c778bb748-85msp\" (UID: \"6ad46257-20db-42d2-b357-93e753f0c2ca\") " pod="calico-system/goldmane-7c778bb748-85msp" Dec 16 12:19:36.861444 kubelet[2821]: I1216 12:19:36.861193 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9rrb\" (UniqueName: \"kubernetes.io/projected/bbfa367f-11d3-466c-9181-c6ee23836f5f-kube-api-access-s9rrb\") pod \"calico-kube-controllers-5d865fff8d-bxz6x\" (UID: \"bbfa367f-11d3-466c-9181-c6ee23836f5f\") " 
pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" Dec 16 12:19:36.869415 systemd[1]: Created slice kubepods-besteffort-pod78447202_a453_4865_9c1c_3b28d64baee4.slice - libcontainer container kubepods-besteffort-pod78447202_a453_4865_9c1c_3b28d64baee4.slice. Dec 16 12:19:36.878677 systemd[1]: Created slice kubepods-besteffort-pod4dd8654f_30cc_4aed_bf7b_e1600d664a65.slice - libcontainer container kubepods-besteffort-pod4dd8654f_30cc_4aed_bf7b_e1600d664a65.slice. Dec 16 12:19:37.119720 containerd[1576]: time="2025-12-16T12:19:37.119552264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zf8pk,Uid:53dd6675-4fbd-4c22-90a9-ace54315889a,Namespace:kube-system,Attempt:0,}" Dec 16 12:19:37.129549 containerd[1576]: time="2025-12-16T12:19:37.129492281Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-v6jwv,Uid:2c875d65-0c1c-4f65-a11e-e47b73dda454,Namespace:kube-system,Attempt:0,}" Dec 16 12:19:37.142751 containerd[1576]: time="2025-12-16T12:19:37.142712448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85ddfd7b99-s9l56,Uid:b4c4df94-c3ee-426c-b06d-ed9edc99469b,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:19:37.154090 containerd[1576]: time="2025-12-16T12:19:37.154018511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-85msp,Uid:6ad46257-20db-42d2-b357-93e753f0c2ca,Namespace:calico-system,Attempt:0,}" Dec 16 12:19:37.167355 containerd[1576]: time="2025-12-16T12:19:37.167213837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d865fff8d-bxz6x,Uid:bbfa367f-11d3-466c-9181-c6ee23836f5f,Namespace:calico-system,Attempt:0,}" Dec 16 12:19:37.178130 containerd[1576]: time="2025-12-16T12:19:37.178005402Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-849969496b-sljp9,Uid:78447202-a453-4865-9c1c-3b28d64baee4,Namespace:calico-system,Attempt:0,}" Dec 16 12:19:37.186130 containerd[1576]: time="2025-12-16T12:19:37.185949351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85ddfd7b99-lzr5b,Uid:4dd8654f-30cc-4aed-bf7b-e1600d664a65,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:19:37.375152 containerd[1576]: time="2025-12-16T12:19:37.374883424Z" level=error msg="Failed to destroy network for sandbox \"48b04d65a51c8f51dab44ef85150fbc15898e4a1fc6d068df88300e9b425c9fe\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.384131 containerd[1576]: time="2025-12-16T12:19:37.384025533Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-v6jwv,Uid:2c875d65-0c1c-4f65-a11e-e47b73dda454,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"48b04d65a51c8f51dab44ef85150fbc15898e4a1fc6d068df88300e9b425c9fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.384416 kubelet[2821]: E1216 12:19:37.384319 2821 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48b04d65a51c8f51dab44ef85150fbc15898e4a1fc6d068df88300e9b425c9fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.384416 kubelet[2821]: E1216 12:19:37.384394 2821 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48b04d65a51c8f51dab44ef85150fbc15898e4a1fc6d068df88300e9b425c9fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-v6jwv" Dec 16 12:19:37.384416 kubelet[2821]: E1216 12:19:37.384412 2821 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48b04d65a51c8f51dab44ef85150fbc15898e4a1fc6d068df88300e9b425c9fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-v6jwv" Dec 16 12:19:37.384507 kubelet[2821]: E1216 12:19:37.384465 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-v6jwv_kube-system(2c875d65-0c1c-4f65-a11e-e47b73dda454)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-v6jwv_kube-system(2c875d65-0c1c-4f65-a11e-e47b73dda454)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48b04d65a51c8f51dab44ef85150fbc15898e4a1fc6d068df88300e9b425c9fe\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-v6jwv" podUID="2c875d65-0c1c-4f65-a11e-e47b73dda454" Dec 16 12:19:37.426375 containerd[1576]: time="2025-12-16T12:19:37.426305764Z" level=error msg="Failed to destroy network for sandbox \"0b9316c35f8a41554210a625d3f72c62cef8ee60fb6c899dd6f83625fe398bce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.435496 containerd[1576]: time="2025-12-16T12:19:37.435356870Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d865fff8d-bxz6x,Uid:bbfa367f-11d3-466c-9181-c6ee23836f5f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b9316c35f8a41554210a625d3f72c62cef8ee60fb6c899dd6f83625fe398bce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.435799 kubelet[2821]: E1216 12:19:37.435701 2821 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b9316c35f8a41554210a625d3f72c62cef8ee60fb6c899dd6f83625fe398bce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.435799 kubelet[2821]: E1216 12:19:37.435760 2821 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b9316c35f8a41554210a625d3f72c62cef8ee60fb6c899dd6f83625fe398bce\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" Dec 16 12:19:37.435799 kubelet[2821]: E1216 12:19:37.435779 2821 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b9316c35f8a41554210a625d3f72c62cef8ee60fb6c899dd6f83625fe398bce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" Dec 16 12:19:37.436517 kubelet[2821]: E1216 12:19:37.435834 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5d865fff8d-bxz6x_calico-system(bbfa367f-11d3-466c-9181-c6ee23836f5f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5d865fff8d-bxz6x_calico-system(bbfa367f-11d3-466c-9181-c6ee23836f5f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0b9316c35f8a41554210a625d3f72c62cef8ee60fb6c899dd6f83625fe398bce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:19:37.440329 containerd[1576]: time="2025-12-16T12:19:37.440233915Z" level=error msg="Failed to destroy network for sandbox \"4c92a1a90e60893ea5bef55e9abbd46f50814063be6c8a8809f263da63371d61\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.446598 containerd[1576]: time="2025-12-16T12:19:37.446529968Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zf8pk,Uid:53dd6675-4fbd-4c22-90a9-ace54315889a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c92a1a90e60893ea5bef55e9abbd46f50814063be6c8a8809f263da63371d61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.446926 kubelet[2821]: E1216 12:19:37.446798 2821 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c92a1a90e60893ea5bef55e9abbd46f50814063be6c8a8809f263da63371d61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.446926 kubelet[2821]: E1216 12:19:37.446867 2821 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c92a1a90e60893ea5bef55e9abbd46f50814063be6c8a8809f263da63371d61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-zf8pk" Dec 16 12:19:37.446926 kubelet[2821]: E1216 12:19:37.446885 
2821 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c92a1a90e60893ea5bef55e9abbd46f50814063be6c8a8809f263da63371d61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-zf8pk" Dec 16 12:19:37.447924 kubelet[2821]: E1216 12:19:37.446939 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-zf8pk_kube-system(53dd6675-4fbd-4c22-90a9-ace54315889a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-zf8pk_kube-system(53dd6675-4fbd-4c22-90a9-ace54315889a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4c92a1a90e60893ea5bef55e9abbd46f50814063be6c8a8809f263da63371d61\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-zf8pk" podUID="53dd6675-4fbd-4c22-90a9-ace54315889a" Dec 16 12:19:37.451806 containerd[1576]: time="2025-12-16T12:19:37.451718944Z" level=error msg="Failed to destroy network for sandbox \"8961c2fcaa2944a669c927033340ddea298ccb595061f701c0f0ecb8ce91bc50\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.454937 containerd[1576]: time="2025-12-16T12:19:37.454895692Z" level=error msg="Failed to destroy network for sandbox \"7460b23c8c555818c5e8d87dca9686473d42d3d45c7d4729774485f3cd9c7fdd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.456034 containerd[1576]: time="2025-12-16T12:19:37.455980528Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-849969496b-sljp9,Uid:78447202-a453-4865-9c1c-3b28d64baee4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8961c2fcaa2944a669c927033340ddea298ccb595061f701c0f0ecb8ce91bc50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.456259 kubelet[2821]: E1216 12:19:37.456216 2821 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8961c2fcaa2944a669c927033340ddea298ccb595061f701c0f0ecb8ce91bc50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.456341 kubelet[2821]: E1216 12:19:37.456272 2821 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8961c2fcaa2944a669c927033340ddea298ccb595061f701c0f0ecb8ce91bc50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-849969496b-sljp9" Dec 16 12:19:37.456341 
kubelet[2821]: E1216 12:19:37.456297 2821 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8961c2fcaa2944a669c927033340ddea298ccb595061f701c0f0ecb8ce91bc50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-849969496b-sljp9" Dec 16 12:19:37.456546 kubelet[2821]: E1216 12:19:37.456352 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-849969496b-sljp9_calico-system(78447202-a453-4865-9c1c-3b28d64baee4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-849969496b-sljp9_calico-system(78447202-a453-4865-9c1c-3b28d64baee4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8961c2fcaa2944a669c927033340ddea298ccb595061f701c0f0ecb8ce91bc50\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-849969496b-sljp9" podUID="78447202-a453-4865-9c1c-3b28d64baee4" Dec 16 12:19:37.459890 containerd[1576]: time="2025-12-16T12:19:37.459822658Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85ddfd7b99-lzr5b,Uid:4dd8654f-30cc-4aed-bf7b-e1600d664a65,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7460b23c8c555818c5e8d87dca9686473d42d3d45c7d4729774485f3cd9c7fdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.460353 kubelet[2821]: E1216 12:19:37.460310 2821 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7460b23c8c555818c5e8d87dca9686473d42d3d45c7d4729774485f3cd9c7fdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.460553 kubelet[2821]: E1216 12:19:37.460369 2821 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7460b23c8c555818c5e8d87dca9686473d42d3d45c7d4729774485f3cd9c7fdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" Dec 16 12:19:37.460553 kubelet[2821]: E1216 12:19:37.460388 2821 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7460b23c8c555818c5e8d87dca9686473d42d3d45c7d4729774485f3cd9c7fdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" Dec 16 12:19:37.460553 kubelet[2821]: E1216 12:19:37.460443 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-85ddfd7b99-lzr5b_calico-apiserver(4dd8654f-30cc-4aed-bf7b-e1600d664a65)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85ddfd7b99-lzr5b_calico-apiserver(4dd8654f-30cc-4aed-bf7b-e1600d664a65)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7460b23c8c555818c5e8d87dca9686473d42d3d45c7d4729774485f3cd9c7fdd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:19:37.464258 containerd[1576]: time="2025-12-16T12:19:37.463782192Z" level=error msg="Failed to destroy network for sandbox \"0ba6b173a0cf7112f078f0e4f993aaaa6299e3ead52141cebf197adfeac85573\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.469472 containerd[1576]: time="2025-12-16T12:19:37.469401142Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-85msp,Uid:6ad46257-20db-42d2-b357-93e753f0c2ca,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ba6b173a0cf7112f078f0e4f993aaaa6299e3ead52141cebf197adfeac85573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.470218 kubelet[2821]: E1216 12:19:37.469659 2821 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ba6b173a0cf7112f078f0e4f993aaaa6299e3ead52141cebf197adfeac85573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.470218 kubelet[2821]: E1216 12:19:37.469717 2821 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ba6b173a0cf7112f078f0e4f993aaaa6299e3ead52141cebf197adfeac85573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-85msp" Dec 16 12:19:37.470218 kubelet[2821]: E1216 12:19:37.469737 2821 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ba6b173a0cf7112f078f0e4f993aaaa6299e3ead52141cebf197adfeac85573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-85msp" Dec 16 12:19:37.470326 kubelet[2821]: E1216 12:19:37.469785 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-85msp_calico-system(6ad46257-20db-42d2-b357-93e753f0c2ca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-85msp_calico-system(6ad46257-20db-42d2-b357-93e753f0c2ca)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"0ba6b173a0cf7112f078f0e4f993aaaa6299e3ead52141cebf197adfeac85573\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:19:37.475915 containerd[1576]: time="2025-12-16T12:19:37.475869281Z" level=error msg="Failed to destroy network for sandbox \"80c828eff3237dd5826c49ccaa4f6a1f1313eb5c374f8b252b667217b08f96cf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.478651 containerd[1576]: time="2025-12-16T12:19:37.478573733Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85ddfd7b99-s9l56,Uid:b4c4df94-c3ee-426c-b06d-ed9edc99469b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"80c828eff3237dd5826c49ccaa4f6a1f1313eb5c374f8b252b667217b08f96cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.479339 kubelet[2821]: E1216 12:19:37.479286 2821 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80c828eff3237dd5826c49ccaa4f6a1f1313eb5c374f8b252b667217b08f96cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:37.479824 kubelet[2821]: E1216 12:19:37.479795 2821 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80c828eff3237dd5826c49ccaa4f6a1f1313eb5c374f8b252b667217b08f96cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" Dec 16 12:19:37.479901 kubelet[2821]: E1216 12:19:37.479829 2821 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80c828eff3237dd5826c49ccaa4f6a1f1313eb5c374f8b252b667217b08f96cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" Dec 16 12:19:37.479901 kubelet[2821]: E1216 12:19:37.479879 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85ddfd7b99-s9l56_calico-apiserver(b4c4df94-c3ee-426c-b06d-ed9edc99469b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85ddfd7b99-s9l56_calico-apiserver(b4c4df94-c3ee-426c-b06d-ed9edc99469b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"80c828eff3237dd5826c49ccaa4f6a1f1313eb5c374f8b252b667217b08f96cf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:19:37.628568 containerd[1576]: time="2025-12-16T12:19:37.628433964Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 12:19:38.447911 systemd[1]: Created slice kubepods-besteffort-pod30279a80_ac32_4c4e_affe_8e2742945896.slice - libcontainer container kubepods-besteffort-pod30279a80_ac32_4c4e_affe_8e2742945896.slice. Dec 16 12:19:38.452982 containerd[1576]: time="2025-12-16T12:19:38.452789673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6nr2l,Uid:30279a80-ac32-4c4e-affe-8e2742945896,Namespace:calico-system,Attempt:0,}" Dec 16 12:19:38.514391 containerd[1576]: time="2025-12-16T12:19:38.514116467Z" level=error msg="Failed to destroy network for sandbox \"37abab6df82e92bc2ac6bfdc1aef06c882bcf25526973a56b4915a71e41dc5fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:38.518073 systemd[1]: run-netns-cni\x2d3b9362c5\x2d27bd\x2d7669\x2d48dd\x2d422fcd1c08a8.mount: Deactivated successfully. Dec 16 12:19:38.522950 containerd[1576]: time="2025-12-16T12:19:38.522815995Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6nr2l,Uid:30279a80-ac32-4c4e-affe-8e2742945896,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"37abab6df82e92bc2ac6bfdc1aef06c882bcf25526973a56b4915a71e41dc5fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:38.523330 kubelet[2821]: E1216 12:19:38.523290 2821 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37abab6df82e92bc2ac6bfdc1aef06c882bcf25526973a56b4915a71e41dc5fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:19:38.524304 kubelet[2821]: E1216 12:19:38.523729 2821 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37abab6df82e92bc2ac6bfdc1aef06c882bcf25526973a56b4915a71e41dc5fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6nr2l" Dec 16 12:19:38.524304 kubelet[2821]: E1216 12:19:38.523771 2821 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37abab6df82e92bc2ac6bfdc1aef06c882bcf25526973a56b4915a71e41dc5fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6nr2l" Dec 16 12:19:38.524304 kubelet[2821]: E1216 12:19:38.523880 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"37abab6df82e92bc2ac6bfdc1aef06c882bcf25526973a56b4915a71e41dc5fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:19:42.273421 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1817954051.mount: Deactivated successfully. Dec 16 12:19:42.305269 containerd[1576]: time="2025-12-16T12:19:42.305180696Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Dec 16 12:19:42.307705 containerd[1576]: time="2025-12-16T12:19:42.307653133Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 4.679120446s" Dec 16 12:19:42.307705 containerd[1576]: time="2025-12-16T12:19:42.307700894Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 16 12:19:42.340649 containerd[1576]: time="2025-12-16T12:19:42.340577109Z" level=info msg="CreateContainer within sandbox \"5f12ec92a87937ac8749df3efb24f04fb11524d8878e31738bdec2af2ab299ef\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 12:19:42.354664 containerd[1576]: time="2025-12-16T12:19:42.354566941Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:42.355462 containerd[1576]: time="2025-12-16T12:19:42.355434007Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:42.356026 containerd[1576]: time="2025-12-16T12:19:42.355997665Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:19:42.365083 containerd[1576]: time="2025-12-16T12:19:42.365016023Z" level=info msg="Container d271b136715ab1c9591aa403000f4a69741e23ea42510087c4904e676d09a407: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:19:42.370374 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4232889861.mount: Deactivated successfully. 
Dec 16 12:19:42.394041 containerd[1576]: time="2025-12-16T12:19:42.393902715Z" level=info msg="CreateContainer within sandbox \"5f12ec92a87937ac8749df3efb24f04fb11524d8878e31738bdec2af2ab299ef\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"d271b136715ab1c9591aa403000f4a69741e23ea42510087c4904e676d09a407\"" Dec 16 12:19:42.394986 containerd[1576]: time="2025-12-16T12:19:42.394777782Z" level=info msg="StartContainer for \"d271b136715ab1c9591aa403000f4a69741e23ea42510087c4904e676d09a407\"" Dec 16 12:19:42.398504 containerd[1576]: time="2025-12-16T12:19:42.398232368Z" level=info msg="connecting to shim d271b136715ab1c9591aa403000f4a69741e23ea42510087c4904e676d09a407" address="unix:///run/containerd/s/e8d77f0605ce36c6a53a658a13c530d642220b2839f6c975169663d7a1618770" protocol=ttrpc version=3 Dec 16 12:19:42.477906 systemd[1]: Started cri-containerd-d271b136715ab1c9591aa403000f4a69741e23ea42510087c4904e676d09a407.scope - libcontainer container d271b136715ab1c9591aa403000f4a69741e23ea42510087c4904e676d09a407. Dec 16 12:19:42.538000 audit: BPF prog-id=170 op=LOAD Dec 16 12:19:42.539863 kernel: kauditd_printk_skb: 6 callbacks suppressed Dec 16 12:19:42.539908 kernel: audit: type=1334 audit(1765887582.538:562): prog-id=170 op=LOAD Dec 16 12:19:42.538000 audit[3807]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=3323 pid=3807 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:42.543428 kernel: audit: type=1300 audit(1765887582.538:562): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=3323 pid=3807 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:42.538000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373162313336373135616231633935393161613430333030306634 Dec 16 12:19:42.547400 kernel: audit: type=1327 audit(1765887582.538:562): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373162313336373135616231633935393161613430333030306634 Dec 16 12:19:42.547500 kernel: audit: type=1334 audit(1765887582.542:563): prog-id=171 op=LOAD Dec 16 12:19:42.547689 kernel: audit: type=1300 audit(1765887582.542:563): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=3323 pid=3807 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:42.542000 audit: BPF prog-id=171 op=LOAD Dec 16 12:19:42.542000 audit[3807]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=3323 pid=3807 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:42.542000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373162313336373135616231633935393161613430333030306634 Dec 16 12:19:42.551957 kernel: audit: type=1327 audit(1765887582.542:563): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373162313336373135616231633935393161613430333030306634 Dec 16 12:19:42.543000 audit: BPF prog-id=171 op=UNLOAD Dec 16 12:19:42.552749 kernel: audit: type=1334 audit(1765887582.543:564): prog-id=171 op=UNLOAD Dec 16 12:19:42.552836 kernel: audit: type=1300 audit(1765887582.543:564): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3323 pid=3807 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:42.543000 audit[3807]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3323 pid=3807 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:42.543000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373162313336373135616231633935393161613430333030306634 Dec 16 12:19:42.556765 kernel: audit: type=1327 audit(1765887582.543:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373162313336373135616231633935393161613430333030306634 Dec 16 12:19:42.543000 audit: BPF prog-id=170 op=UNLOAD Dec 16 12:19:42.557826 kernel: audit: type=1334 audit(1765887582.543:565): prog-id=170 op=UNLOAD Dec 16 12:19:42.543000 audit[3807]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3323 pid=3807 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:42.543000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373162313336373135616231633935393161613430333030306634 Dec 16 12:19:42.543000 audit: BPF prog-id=172 op=LOAD Dec 16 12:19:42.543000 audit[3807]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=3323 pid=3807 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:42.543000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373162313336373135616231633935393161613430333030306634 Dec 16 12:19:42.578904 
containerd[1576]: time="2025-12-16T12:19:42.578843863Z" level=info msg="StartContainer for \"d271b136715ab1c9591aa403000f4a69741e23ea42510087c4904e676d09a407\" returns successfully" Dec 16 12:19:42.792998 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 12:19:42.793264 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Dec 16 12:19:42.964478 kubelet[2821]: I1216 12:19:42.964389 2821 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-s82gj" podStartSLOduration=1.941617216 podStartE2EDuration="14.964366241s" podCreationTimestamp="2025-12-16 12:19:28 +0000 UTC" firstStartedPulling="2025-12-16 12:19:29.285774055 +0000 UTC m=+27.021343897" lastFinishedPulling="2025-12-16 12:19:42.30852308 +0000 UTC m=+40.044092922" observedRunningTime="2025-12-16 12:19:42.695899035 +0000 UTC m=+40.431468877" watchObservedRunningTime="2025-12-16 12:19:42.964366241 +0000 UTC m=+40.699936203" Dec 16 12:19:43.009305 kubelet[2821]: I1216 12:19:43.009197 2821 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/78447202-a453-4865-9c1c-3b28d64baee4-whisker-backend-key-pair\") pod \"78447202-a453-4865-9c1c-3b28d64baee4\" (UID: \"78447202-a453-4865-9c1c-3b28d64baee4\") " Dec 16 12:19:43.009457 kubelet[2821]: I1216 12:19:43.009336 2821 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2nrr\" (UniqueName: \"kubernetes.io/projected/78447202-a453-4865-9c1c-3b28d64baee4-kube-api-access-h2nrr\") pod \"78447202-a453-4865-9c1c-3b28d64baee4\" (UID: \"78447202-a453-4865-9c1c-3b28d64baee4\") " Dec 16 12:19:43.009457 kubelet[2821]: I1216 12:19:43.009367 2821 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78447202-a453-4865-9c1c-3b28d64baee4-whisker-ca-bundle\") pod \"78447202-a453-4865-9c1c-3b28d64baee4\" (UID: \"78447202-a453-4865-9c1c-3b28d64baee4\") " Dec 16 12:19:43.010846 kubelet[2821]: I1216 12:19:43.009816 2821 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78447202-a453-4865-9c1c-3b28d64baee4-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "78447202-a453-4865-9c1c-3b28d64baee4" (UID: "78447202-a453-4865-9c1c-3b28d64baee4"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 12:19:43.013978 kubelet[2821]: I1216 12:19:43.013839 2821 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78447202-a453-4865-9c1c-3b28d64baee4-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "78447202-a453-4865-9c1c-3b28d64baee4" (UID: "78447202-a453-4865-9c1c-3b28d64baee4"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 12:19:43.017453 kubelet[2821]: I1216 12:19:43.017368 2821 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78447202-a453-4865-9c1c-3b28d64baee4-kube-api-access-h2nrr" (OuterVolumeSpecName: "kube-api-access-h2nrr") pod "78447202-a453-4865-9c1c-3b28d64baee4" (UID: "78447202-a453-4865-9c1c-3b28d64baee4"). InnerVolumeSpecName "kube-api-access-h2nrr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 12:19:43.110817 kubelet[2821]: I1216 12:19:43.110741 2821 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/78447202-a453-4865-9c1c-3b28d64baee4-whisker-backend-key-pair\") on node \"ci-4547-0-0-5-8fe0b910ae\" DevicePath \"\"" Dec 16 12:19:43.110817 kubelet[2821]: I1216 12:19:43.110784 2821 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h2nrr\" (UniqueName: \"kubernetes.io/projected/78447202-a453-4865-9c1c-3b28d64baee4-kube-api-access-h2nrr\") on node \"ci-4547-0-0-5-8fe0b910ae\" DevicePath \"\"" Dec 16 12:19:43.110817 kubelet[2821]: I1216 12:19:43.110796 2821 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78447202-a453-4865-9c1c-3b28d64baee4-whisker-ca-bundle\") on node \"ci-4547-0-0-5-8fe0b910ae\" DevicePath \"\"" Dec 16 12:19:43.274904 systemd[1]: var-lib-kubelet-pods-78447202\x2da453\x2d4865\x2d9c1c\x2d3b28d64baee4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dh2nrr.mount: Deactivated successfully. Dec 16 12:19:43.275589 systemd[1]: var-lib-kubelet-pods-78447202\x2da453\x2d4865\x2d9c1c\x2d3b28d64baee4-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 16 12:19:43.663424 kubelet[2821]: I1216 12:19:43.663148 2821 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:19:43.670746 systemd[1]: Removed slice kubepods-besteffort-pod78447202_a453_4865_9c1c_3b28d64baee4.slice - libcontainer container kubepods-besteffort-pod78447202_a453_4865_9c1c_3b28d64baee4.slice. Dec 16 12:19:43.771412 systemd[1]: Created slice kubepods-besteffort-podb32c8c20_6807_4c19_8ec5_b6f0be7cc07e.slice - libcontainer container kubepods-besteffort-podb32c8c20_6807_4c19_8ec5_b6f0be7cc07e.slice. 
Dec 16 12:19:43.817540 kubelet[2821]: I1216 12:19:43.817408 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp8ff\" (UniqueName: \"kubernetes.io/projected/b32c8c20-6807-4c19-8ec5-b6f0be7cc07e-kube-api-access-sp8ff\") pod \"whisker-85dd648564-9wttf\" (UID: \"b32c8c20-6807-4c19-8ec5-b6f0be7cc07e\") " pod="calico-system/whisker-85dd648564-9wttf" Dec 16 12:19:43.817922 kubelet[2821]: I1216 12:19:43.817660 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b32c8c20-6807-4c19-8ec5-b6f0be7cc07e-whisker-ca-bundle\") pod \"whisker-85dd648564-9wttf\" (UID: \"b32c8c20-6807-4c19-8ec5-b6f0be7cc07e\") " pod="calico-system/whisker-85dd648564-9wttf" Dec 16 12:19:43.817922 kubelet[2821]: I1216 12:19:43.817697 2821 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b32c8c20-6807-4c19-8ec5-b6f0be7cc07e-whisker-backend-key-pair\") pod \"whisker-85dd648564-9wttf\" (UID: \"b32c8c20-6807-4c19-8ec5-b6f0be7cc07e\") " pod="calico-system/whisker-85dd648564-9wttf" Dec 16 12:19:44.078721 containerd[1576]: time="2025-12-16T12:19:44.078426344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85dd648564-9wttf,Uid:b32c8c20-6807-4c19-8ec5-b6f0be7cc07e,Namespace:calico-system,Attempt:0,}" Dec 16 12:19:44.295856 systemd-networkd[1470]: cali9775341b303: Link UP Dec 16 12:19:44.296104 systemd-networkd[1470]: cali9775341b303: Gained carrier Dec 16 12:19:44.330249 containerd[1576]: 2025-12-16 12:19:44.107 [INFO][3872] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:19:44.330249 containerd[1576]: 2025-12-16 12:19:44.163 [INFO][3872] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0 whisker-85dd648564- calico-system b32c8c20-6807-4c19-8ec5-b6f0be7cc07e 879 0 2025-12-16 12:19:43 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:85dd648564 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-0-0-5-8fe0b910ae whisker-85dd648564-9wttf eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali9775341b303 [] [] }} ContainerID="159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" Namespace="calico-system" Pod="whisker-85dd648564-9wttf" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-" Dec 16 12:19:44.330249 containerd[1576]: 2025-12-16 12:19:44.163 [INFO][3872] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" Namespace="calico-system" Pod="whisker-85dd648564-9wttf" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0" Dec 16 12:19:44.330249 containerd[1576]: 2025-12-16 12:19:44.215 [INFO][3883] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" HandleID="k8s-pod-network.159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0" Dec 16 12:19:44.330501 containerd[1576]: 2025-12-16 12:19:44.215 [INFO][3883] 
ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" HandleID="k8s-pod-network.159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d35e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-5-8fe0b910ae", "pod":"whisker-85dd648564-9wttf", "timestamp":"2025-12-16 12:19:44.215374082 +0000 UTC"}, Hostname:"ci-4547-0-0-5-8fe0b910ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:19:44.330501 containerd[1576]: 2025-12-16 12:19:44.215 [INFO][3883] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:19:44.330501 containerd[1576]: 2025-12-16 12:19:44.215 [INFO][3883] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:19:44.330501 containerd[1576]: 2025-12-16 12:19:44.215 [INFO][3883] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-8fe0b910ae' Dec 16 12:19:44.330501 containerd[1576]: 2025-12-16 12:19:44.231 [INFO][3883] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:44.330501 containerd[1576]: 2025-12-16 12:19:44.239 [INFO][3883] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:44.330501 containerd[1576]: 2025-12-16 12:19:44.245 [INFO][3883] ipam/ipam.go 511: Trying affinity for 192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:44.330501 containerd[1576]: 2025-12-16 12:19:44.249 [INFO][3883] ipam/ipam.go 158: Attempting to load block cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:44.330501 containerd[1576]: 2025-12-16 12:19:44.252 [INFO][3883] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:44.331529 containerd[1576]: 2025-12-16 12:19:44.252 [INFO][3883] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:44.331529 containerd[1576]: 2025-12-16 12:19:44.255 [INFO][3883] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d Dec 16 12:19:44.331529 containerd[1576]: 2025-12-16 12:19:44.262 [INFO][3883] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:44.331529 containerd[1576]: 2025-12-16 12:19:44.269 [INFO][3883] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.126.65/26] block=192.168.126.64/26 handle="k8s-pod-network.159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:44.331529 containerd[1576]: 2025-12-16 12:19:44.270 [INFO][3883] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.126.65/26] handle="k8s-pod-network.159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:44.331529 
containerd[1576]: 2025-12-16 12:19:44.270 [INFO][3883] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:19:44.331529 containerd[1576]: 2025-12-16 12:19:44.270 [INFO][3883] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.126.65/26] IPv6=[] ContainerID="159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" HandleID="k8s-pod-network.159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0" Dec 16 12:19:44.331691 containerd[1576]: 2025-12-16 12:19:44.275 [INFO][3872] cni-plugin/k8s.go 418: Populated endpoint ContainerID="159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" Namespace="calico-system" Pod="whisker-85dd648564-9wttf" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0", GenerateName:"whisker-85dd648564-", Namespace:"calico-system", SelfLink:"", UID:"b32c8c20-6807-4c19-8ec5-b6f0be7cc07e", ResourceVersion:"879", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"85dd648564", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"", Pod:"whisker-85dd648564-9wttf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.126.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9775341b303", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:44.331691 containerd[1576]: 2025-12-16 12:19:44.276 [INFO][3872] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.65/32] ContainerID="159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" Namespace="calico-system" Pod="whisker-85dd648564-9wttf" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0" Dec 16 12:19:44.331776 containerd[1576]: 2025-12-16 12:19:44.276 [INFO][3872] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9775341b303 ContainerID="159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" Namespace="calico-system" Pod="whisker-85dd648564-9wttf" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0" Dec 16 12:19:44.331776 containerd[1576]: 2025-12-16 12:19:44.294 [INFO][3872] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" Namespace="calico-system" Pod="whisker-85dd648564-9wttf" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0" Dec 16 12:19:44.331820 containerd[1576]: 2025-12-16 12:19:44.298 [INFO][3872] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" Namespace="calico-system" Pod="whisker-85dd648564-9wttf" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0", GenerateName:"whisker-85dd648564-", Namespace:"calico-system", SelfLink:"", UID:"b32c8c20-6807-4c19-8ec5-b6f0be7cc07e", ResourceVersion:"879", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"85dd648564", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d", Pod:"whisker-85dd648564-9wttf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.126.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9775341b303", MAC:"26:0b:78:bd:38:77", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:44.331873 containerd[1576]: 2025-12-16 12:19:44.324 [INFO][3872] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" Namespace="calico-system" Pod="whisker-85dd648564-9wttf" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-whisker--85dd648564--9wttf-eth0" Dec 16 12:19:44.444649 containerd[1576]: time="2025-12-16T12:19:44.443591670Z" level=info msg="connecting to shim 159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d" address="unix:///run/containerd/s/e173527ac3bc0617f5add5d59e43e0f6994f956aadf07977da20ebf18fdeb886" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:44.449424 kubelet[2821]: I1216 12:19:44.449379 2821 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78447202-a453-4865-9c1c-3b28d64baee4" path="/var/lib/kubelet/pods/78447202-a453-4865-9c1c-3b28d64baee4/volumes" Dec 16 12:19:44.504893 systemd[1]: Started cri-containerd-159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d.scope - libcontainer container 159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d. 
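The IPAM lines above show the node claiming an affinity for block 192.168.126.64/26 and handing the whisker pod the first usable address in it, which then appears on the endpoint as a /32. The addressing relationship is easy to verify with the standard ipaddress module (illustrative only; the actual allocation is performed by the Calico IPAM plugin logged here):

import ipaddress

# Values taken from the ipam/ipam.go lines above.
block    = ipaddress.ip_network("192.168.126.64/26")
assigned = ipaddress.ip_address("192.168.126.65")
endpoint = ipaddress.ip_network("192.168.126.65/32")

print(assigned in block)                # True: the pod IP comes out of the node's affine block
print(next(block.hosts()) == assigned)  # True: it is the first assignable host in that /26
print(endpoint.subnet_of(block))        # True: the per-pod /32 sits inside the /26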
Dec 16 12:19:44.526000 audit: BPF prog-id=173 op=LOAD Dec 16 12:19:44.527000 audit: BPF prog-id=174 op=LOAD Dec 16 12:19:44.527000 audit[3996]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3977 pid=3996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:44.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135393732366163616230383334616466653064646636386332633332 Dec 16 12:19:44.527000 audit: BPF prog-id=174 op=UNLOAD Dec 16 12:19:44.527000 audit[3996]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3977 pid=3996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:44.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135393732366163616230383334616466653064646636386332633332 Dec 16 12:19:44.527000 audit: BPF prog-id=175 op=LOAD Dec 16 12:19:44.527000 audit[3996]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3977 pid=3996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:44.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135393732366163616230383334616466653064646636386332633332 Dec 16 12:19:44.527000 audit: BPF prog-id=176 op=LOAD Dec 16 12:19:44.527000 audit[3996]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3977 pid=3996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:44.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135393732366163616230383334616466653064646636386332633332 Dec 16 12:19:44.527000 audit: BPF prog-id=176 op=UNLOAD Dec 16 12:19:44.527000 audit[3996]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3977 pid=3996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:44.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135393732366163616230383334616466653064646636386332633332 Dec 16 12:19:44.527000 audit: BPF prog-id=175 op=UNLOAD Dec 16 12:19:44.527000 audit[3996]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3977 pid=3996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:44.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135393732366163616230383334616466653064646636386332633332 Dec 16 12:19:44.527000 audit: BPF prog-id=177 op=LOAD Dec 16 12:19:44.527000 audit[3996]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3977 pid=3996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:44.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3135393732366163616230383334616466653064646636386332633332 Dec 16 12:19:44.605511 containerd[1576]: time="2025-12-16T12:19:44.605392710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85dd648564-9wttf,Uid:b32c8c20-6807-4c19-8ec5-b6f0be7cc07e,Namespace:calico-system,Attempt:0,} returns sandbox id \"159726acab0834adfe0ddf68c2c32c431731f37479f0000e3dc439e4b5c8f41d\"" Dec 16 12:19:44.608554 containerd[1576]: time="2025-12-16T12:19:44.608473843Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:19:44.957401 containerd[1576]: time="2025-12-16T12:19:44.957313399Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:44.958789 containerd[1576]: time="2025-12-16T12:19:44.958739642Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:19:44.958951 containerd[1576]: time="2025-12-16T12:19:44.958759403Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:44.959032 kubelet[2821]: E1216 12:19:44.958989 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:19:44.959119 kubelet[2821]: E1216 12:19:44.959041 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:19:44.964375 kubelet[2821]: E1216 12:19:44.964311 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-85dd648564-9wttf_calico-system(b32c8c20-6807-4c19-8ec5-b6f0be7cc07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:44.966272 containerd[1576]: time="2025-12-16T12:19:44.966210706Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:19:45.379358 containerd[1576]: time="2025-12-16T12:19:45.378971333Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:45.380913 containerd[1576]: time="2025-12-16T12:19:45.380753105Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:19:45.381035 containerd[1576]: time="2025-12-16T12:19:45.380853028Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:45.381358 kubelet[2821]: E1216 12:19:45.381300 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:19:45.381645 kubelet[2821]: E1216 12:19:45.381542 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:19:45.382033 kubelet[2821]: E1216 12:19:45.381913 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-85dd648564-9wttf_calico-system(b32c8c20-6807-4c19-8ec5-b6f0be7cc07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:45.382206 kubelet[2821]: E1216 12:19:45.382158 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:19:45.675441 kubelet[2821]: E1216 12:19:45.675179 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:19:45.707000 audit[4039]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=4039 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:45.707000 audit[4039]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffc48b1f70 a2=0 a3=1 items=0 ppid=2972 pid=4039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:45.707000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:45.714000 audit[4039]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=4039 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:45.714000 audit[4039]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc48b1f70 a2=0 a3=1 items=0 ppid=2972 pid=4039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:45.714000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:45.964026 systemd-networkd[1470]: cali9775341b303: Gained IPv6LL Dec 16 12:19:47.503000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-46.224.130.63:22-138.68.91.238:40782 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:19:47.504035 systemd[1]: Started sshd@7-46.224.130.63:22-138.68.91.238:40782.service - OpenSSH per-connection server daemon (138.68.91.238:40782). 
Dec 16 12:19:48.443978 containerd[1576]: time="2025-12-16T12:19:48.443920074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zf8pk,Uid:53dd6675-4fbd-4c22-90a9-ace54315889a,Namespace:kube-system,Attempt:0,}" Dec 16 12:19:48.596000 systemd-networkd[1470]: calie783b953121: Link UP Dec 16 12:19:48.596695 systemd-networkd[1470]: calie783b953121: Gained carrier Dec 16 12:19:48.614531 containerd[1576]: 2025-12-16 12:19:48.479 [INFO][4108] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:19:48.614531 containerd[1576]: 2025-12-16 12:19:48.499 [INFO][4108] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0 coredns-66bc5c9577- kube-system 53dd6675-4fbd-4c22-90a9-ace54315889a 806 0 2025-12-16 12:19:08 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-5-8fe0b910ae coredns-66bc5c9577-zf8pk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie783b953121 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" Namespace="kube-system" Pod="coredns-66bc5c9577-zf8pk" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-" Dec 16 12:19:48.614531 containerd[1576]: 2025-12-16 12:19:48.500 [INFO][4108] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" Namespace="kube-system" Pod="coredns-66bc5c9577-zf8pk" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0" Dec 16 12:19:48.614531 containerd[1576]: 2025-12-16 12:19:48.532 [INFO][4121] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" HandleID="k8s-pod-network.f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0" Dec 16 12:19:48.614894 containerd[1576]: 2025-12-16 12:19:48.533 [INFO][4121] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" HandleID="k8s-pod-network.f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b000), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-5-8fe0b910ae", "pod":"coredns-66bc5c9577-zf8pk", "timestamp":"2025-12-16 12:19:48.532910678 +0000 UTC"}, Hostname:"ci-4547-0-0-5-8fe0b910ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:19:48.614894 containerd[1576]: 2025-12-16 12:19:48.533 [INFO][4121] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:19:48.614894 containerd[1576]: 2025-12-16 12:19:48.533 [INFO][4121] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:19:48.614894 containerd[1576]: 2025-12-16 12:19:48.533 [INFO][4121] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-8fe0b910ae' Dec 16 12:19:48.614894 containerd[1576]: 2025-12-16 12:19:48.545 [INFO][4121] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:48.614894 containerd[1576]: 2025-12-16 12:19:48.552 [INFO][4121] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:48.614894 containerd[1576]: 2025-12-16 12:19:48.560 [INFO][4121] ipam/ipam.go 511: Trying affinity for 192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:48.614894 containerd[1576]: 2025-12-16 12:19:48.563 [INFO][4121] ipam/ipam.go 158: Attempting to load block cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:48.614894 containerd[1576]: 2025-12-16 12:19:48.566 [INFO][4121] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:48.615351 containerd[1576]: 2025-12-16 12:19:48.566 [INFO][4121] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:48.615351 containerd[1576]: 2025-12-16 12:19:48.571 [INFO][4121] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0 Dec 16 12:19:48.615351 containerd[1576]: 2025-12-16 12:19:48.576 [INFO][4121] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:48.615351 containerd[1576]: 2025-12-16 12:19:48.586 [INFO][4121] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.126.66/26] block=192.168.126.64/26 handle="k8s-pod-network.f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:48.615351 containerd[1576]: 2025-12-16 12:19:48.586 [INFO][4121] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.126.66/26] handle="k8s-pod-network.f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:48.615351 containerd[1576]: 2025-12-16 12:19:48.586 [INFO][4121] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
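The IPAM records above trace Calico's block-affinity flow: the plugin takes the host-wide lock, confirms this node's affinity for 192.168.126.64/26, claims 192.168.126.66 for coredns-66bc5c9577-zf8pk, and releases the lock. A small check with Python's ipaddress module (an illustration using the values from the log, not Calico's own code) confirms the claimed address sits inside that affine block:

    import ipaddress

    block   = ipaddress.ip_network("192.168.126.64/26")   # block the node holds an affinity for
    claimed = ipaddress.ip_address("192.168.126.66")      # address claimed for coredns-66bc5c9577-zf8pk
    print(claimed in block)       # True: the claimed IP falls inside the affine block
    print(block.num_addresses)    # 64: total addresses in a /26 block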
Dec 16 12:19:48.615351 containerd[1576]: 2025-12-16 12:19:48.587 [INFO][4121] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.126.66/26] IPv6=[] ContainerID="f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" HandleID="k8s-pod-network.f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0" Dec 16 12:19:48.616107 containerd[1576]: 2025-12-16 12:19:48.590 [INFO][4108] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" Namespace="kube-system" Pod="coredns-66bc5c9577-zf8pk" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"53dd6675-4fbd-4c22-90a9-ace54315889a", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"", Pod:"coredns-66bc5c9577-zf8pk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.126.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie783b953121", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:48.616107 containerd[1576]: 2025-12-16 12:19:48.590 [INFO][4108] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.66/32] ContainerID="f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" Namespace="kube-system" Pod="coredns-66bc5c9577-zf8pk" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0" Dec 16 12:19:48.616107 containerd[1576]: 2025-12-16 12:19:48.590 [INFO][4108] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie783b953121 ContainerID="f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" Namespace="kube-system" Pod="coredns-66bc5c9577-zf8pk" 
WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0" Dec 16 12:19:48.616107 containerd[1576]: 2025-12-16 12:19:48.597 [INFO][4108] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" Namespace="kube-system" Pod="coredns-66bc5c9577-zf8pk" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0" Dec 16 12:19:48.616107 containerd[1576]: 2025-12-16 12:19:48.597 [INFO][4108] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" Namespace="kube-system" Pod="coredns-66bc5c9577-zf8pk" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"53dd6675-4fbd-4c22-90a9-ace54315889a", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0", Pod:"coredns-66bc5c9577-zf8pk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.126.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie783b953121", MAC:"9a:f5:65:2b:b4:e5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:48.616986 containerd[1576]: 2025-12-16 12:19:48.609 [INFO][4108] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" Namespace="kube-system" Pod="coredns-66bc5c9577-zf8pk" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--zf8pk-eth0" Dec 16 12:19:48.644171 containerd[1576]: time="2025-12-16T12:19:48.644120632Z" level=info msg="connecting to shim f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0" 
address="unix:///run/containerd/s/cc9b44bb85a92e0c636801594c0746ca9f9c7768a243cfb73b8928d5f9538822" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:48.682076 systemd[1]: Started cri-containerd-f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0.scope - libcontainer container f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0. Dec 16 12:19:48.696000 audit: BPF prog-id=178 op=LOAD Dec 16 12:19:48.698458 kernel: kauditd_printk_skb: 34 callbacks suppressed Dec 16 12:19:48.698572 kernel: audit: type=1334 audit(1765887588.696:578): prog-id=178 op=LOAD Dec 16 12:19:48.698000 audit: BPF prog-id=179 op=LOAD Dec 16 12:19:48.698000 audit[4154]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4143 pid=4154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.702635 kernel: audit: type=1334 audit(1765887588.698:579): prog-id=179 op=LOAD Dec 16 12:19:48.702741 kernel: audit: type=1300 audit(1765887588.698:579): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4143 pid=4154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.702769 kernel: audit: type=1327 audit(1765887588.698:579): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638333338666363363734656263393731376535633333646334393462 Dec 16 12:19:48.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638333338666363363734656263393731376535633333646334393462 Dec 16 12:19:48.699000 audit: BPF prog-id=179 op=UNLOAD Dec 16 12:19:48.705751 kernel: audit: type=1334 audit(1765887588.699:580): prog-id=179 op=UNLOAD Dec 16 12:19:48.705782 kernel: audit: type=1300 audit(1765887588.699:580): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4143 pid=4154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.699000 audit[4154]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4143 pid=4154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.707710 kernel: audit: type=1327 audit(1765887588.699:580): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638333338666363363734656263393731376535633333646334393462 Dec 16 12:19:48.699000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638333338666363363734656263393731376535633333646334393462 Dec 16 12:19:48.699000 audit: BPF 
prog-id=180 op=LOAD Dec 16 12:19:48.699000 audit[4154]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=4143 pid=4154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.712432 kernel: audit: type=1334 audit(1765887588.699:581): prog-id=180 op=LOAD Dec 16 12:19:48.712512 kernel: audit: type=1300 audit(1765887588.699:581): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=4143 pid=4154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.699000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638333338666363363734656263393731376535633333646334393462 Dec 16 12:19:48.714693 kernel: audit: type=1327 audit(1765887588.699:581): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638333338666363363734656263393731376535633333646334393462 Dec 16 12:19:48.702000 audit: BPF prog-id=181 op=LOAD Dec 16 12:19:48.702000 audit[4154]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=4143 pid=4154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.702000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638333338666363363734656263393731376535633333646334393462 Dec 16 12:19:48.704000 audit: BPF prog-id=181 op=UNLOAD Dec 16 12:19:48.704000 audit[4154]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4143 pid=4154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638333338666363363734656263393731376535633333646334393462 Dec 16 12:19:48.704000 audit: BPF prog-id=180 op=UNLOAD Dec 16 12:19:48.704000 audit[4154]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4143 pid=4154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638333338666363363734656263393731376535633333646334393462 Dec 16 12:19:48.704000 audit: BPF prog-id=182 op=LOAD Dec 16 12:19:48.704000 
audit[4154]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=4143 pid=4154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638333338666363363734656263393731376535633333646334393462 Dec 16 12:19:48.738890 containerd[1576]: time="2025-12-16T12:19:48.738807637Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zf8pk,Uid:53dd6675-4fbd-4c22-90a9-ace54315889a,Namespace:kube-system,Attempt:0,} returns sandbox id \"f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0\"" Dec 16 12:19:48.750513 containerd[1576]: time="2025-12-16T12:19:48.750366964Z" level=info msg="CreateContainer within sandbox \"f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:19:48.768871 containerd[1576]: time="2025-12-16T12:19:48.768816768Z" level=info msg="Container 0bf932387aa71541afe9b34c8bd6fecc9893d72810205a10de39443bfe192a66: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:19:48.777648 containerd[1576]: time="2025-12-16T12:19:48.777525815Z" level=info msg="CreateContainer within sandbox \"f8338fcc674ebc9717e5c33dc494be2f4d9752f861dee1561d5e29268f33dfd0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0bf932387aa71541afe9b34c8bd6fecc9893d72810205a10de39443bfe192a66\"" Dec 16 12:19:48.782398 containerd[1576]: time="2025-12-16T12:19:48.781436485Z" level=info msg="StartContainer for \"0bf932387aa71541afe9b34c8bd6fecc9893d72810205a10de39443bfe192a66\"" Dec 16 12:19:48.783590 containerd[1576]: time="2025-12-16T12:19:48.783495464Z" level=info msg="connecting to shim 0bf932387aa71541afe9b34c8bd6fecc9893d72810205a10de39443bfe192a66" address="unix:///run/containerd/s/cc9b44bb85a92e0c636801594c0746ca9f9c7768a243cfb73b8928d5f9538822" protocol=ttrpc version=3 Dec 16 12:19:48.809071 systemd[1]: Started cri-containerd-0bf932387aa71541afe9b34c8bd6fecc9893d72810205a10de39443bfe192a66.scope - libcontainer container 0bf932387aa71541afe9b34c8bd6fecc9893d72810205a10de39443bfe192a66. 
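The runc audit records above all carry arch=c00000b7, i.e. AArch64, so the syscall numbers resolve against the generic arm64 table. The mapping below is an assumption taken from that table rather than from the log itself, but it explains the recurring pattern: each BPF prog-id LOAD pairs a syscall=280 record with a later syscall=57, since closing the program's file descriptor is what produces the op=UNLOAD record.

    # Assumed arm64 (generic unistd) syscall numbers for the audit records above.
    AARCH64_SYSCALLS = {57: "close", 211: "sendmsg", 280: "bpf"}
    for nr in (280, 57, 211):
        print(nr, AARCH64_SYSCALLS[nr])
    # 280 bpf     -> the BPF prog-id LOAD records emitted around each runc invocation
    # 57  close   -> releasing the prog fd, matching the op=UNLOAD records
    # 211 sendmsg -> iptables-restore pushing its nft ruleset over netlink (NETFILTER_CFG records)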
Dec 16 12:19:48.823000 audit: BPF prog-id=183 op=LOAD Dec 16 12:19:48.824000 audit: BPF prog-id=184 op=LOAD Dec 16 12:19:48.824000 audit[4180]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4143 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.824000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062663933323338376161373135343161666539623334633862643666 Dec 16 12:19:48.824000 audit: BPF prog-id=184 op=UNLOAD Dec 16 12:19:48.824000 audit[4180]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4143 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.824000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062663933323338376161373135343161666539623334633862643666 Dec 16 12:19:48.824000 audit: BPF prog-id=185 op=LOAD Dec 16 12:19:48.824000 audit[4180]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4143 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.824000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062663933323338376161373135343161666539623334633862643666 Dec 16 12:19:48.825000 audit: BPF prog-id=186 op=LOAD Dec 16 12:19:48.825000 audit[4180]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4143 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.825000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062663933323338376161373135343161666539623334633862643666 Dec 16 12:19:48.825000 audit: BPF prog-id=186 op=UNLOAD Dec 16 12:19:48.825000 audit[4180]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4143 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.825000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062663933323338376161373135343161666539623334633862643666 Dec 16 12:19:48.825000 audit: BPF prog-id=185 op=UNLOAD Dec 16 12:19:48.825000 audit[4180]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4143 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.825000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062663933323338376161373135343161666539623334633862643666 Dec 16 12:19:48.825000 audit: BPF prog-id=187 op=LOAD Dec 16 12:19:48.825000 audit[4180]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4143 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:48.825000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062663933323338376161373135343161666539623334633862643666 Dec 16 12:19:48.848981 containerd[1576]: time="2025-12-16T12:19:48.848933440Z" level=info msg="StartContainer for \"0bf932387aa71541afe9b34c8bd6fecc9893d72810205a10de39443bfe192a66\" returns successfully" Dec 16 12:19:49.443609 containerd[1576]: time="2025-12-16T12:19:49.443523794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85ddfd7b99-s9l56,Uid:b4c4df94-c3ee-426c-b06d-ed9edc99469b,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:19:49.445267 containerd[1576]: time="2025-12-16T12:19:49.445224562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d865fff8d-bxz6x,Uid:bbfa367f-11d3-466c-9181-c6ee23836f5f,Namespace:calico-system,Attempt:0,}" Dec 16 12:19:49.643607 systemd-networkd[1470]: calie27493fd194: Link UP Dec 16 12:19:49.646794 systemd-networkd[1470]: calie27493fd194: Gained carrier Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.497 [INFO][4238] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.518 [INFO][4238] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0 calico-apiserver-85ddfd7b99- calico-apiserver b4c4df94-c3ee-426c-b06d-ed9edc99469b 810 0 2025-12-16 12:19:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85ddfd7b99 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-5-8fe0b910ae calico-apiserver-85ddfd7b99-s9l56 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie27493fd194 [] [] }} ContainerID="f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-s9l56" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.518 [INFO][4238] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" 
Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-s9l56" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.555 [INFO][4262] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" HandleID="k8s-pod-network.f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.555 [INFO][4262] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" HandleID="k8s-pod-network.f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b200), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-5-8fe0b910ae", "pod":"calico-apiserver-85ddfd7b99-s9l56", "timestamp":"2025-12-16 12:19:49.555235605 +0000 UTC"}, Hostname:"ci-4547-0-0-5-8fe0b910ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.555 [INFO][4262] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.555 [INFO][4262] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.555 [INFO][4262] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-8fe0b910ae' Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.574 [INFO][4262] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.583 [INFO][4262] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.590 [INFO][4262] ipam/ipam.go 511: Trying affinity for 192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.593 [INFO][4262] ipam/ipam.go 158: Attempting to load block cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.596 [INFO][4262] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.596 [INFO][4262] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.599 [INFO][4262] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.607 [INFO][4262] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.126.64/26 
handle="k8s-pod-network.f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.622 [INFO][4262] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.126.67/26] block=192.168.126.64/26 handle="k8s-pod-network.f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.622 [INFO][4262] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.126.67/26] handle="k8s-pod-network.f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.622 [INFO][4262] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:19:49.675024 containerd[1576]: 2025-12-16 12:19:49.624 [INFO][4262] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.126.67/26] IPv6=[] ContainerID="f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" HandleID="k8s-pod-network.f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0" Dec 16 12:19:49.676281 containerd[1576]: 2025-12-16 12:19:49.628 [INFO][4238] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-s9l56" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0", GenerateName:"calico-apiserver-85ddfd7b99-", Namespace:"calico-apiserver", SelfLink:"", UID:"b4c4df94-c3ee-426c-b06d-ed9edc99469b", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85ddfd7b99", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"", Pod:"calico-apiserver-85ddfd7b99-s9l56", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.126.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie27493fd194", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:49.676281 containerd[1576]: 2025-12-16 12:19:49.628 [INFO][4238] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.67/32] ContainerID="f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-s9l56" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0" Dec 16 12:19:49.676281 
containerd[1576]: 2025-12-16 12:19:49.628 [INFO][4238] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie27493fd194 ContainerID="f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-s9l56" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0" Dec 16 12:19:49.676281 containerd[1576]: 2025-12-16 12:19:49.648 [INFO][4238] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-s9l56" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0" Dec 16 12:19:49.676281 containerd[1576]: 2025-12-16 12:19:49.650 [INFO][4238] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-s9l56" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0", GenerateName:"calico-apiserver-85ddfd7b99-", Namespace:"calico-apiserver", SelfLink:"", UID:"b4c4df94-c3ee-426c-b06d-ed9edc99469b", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85ddfd7b99", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b", Pod:"calico-apiserver-85ddfd7b99-s9l56", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.126.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie27493fd194", MAC:"62:71:3e:5e:e6:11", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:49.676281 containerd[1576]: 2025-12-16 12:19:49.670 [INFO][4238] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-s9l56" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--s9l56-eth0" Dec 16 12:19:49.756415 containerd[1576]: time="2025-12-16T12:19:49.755444616Z" level=info msg="connecting to shim f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b" address="unix:///run/containerd/s/670d200150e13837a212d273b0b18bea438200bef88171f3ba59ea5eadb368be" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:49.794674 kubelet[2821]: I1216 12:19:49.792904 2821 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-zf8pk" podStartSLOduration=41.792885065 podStartE2EDuration="41.792885065s" podCreationTimestamp="2025-12-16 12:19:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:19:49.743525922 +0000 UTC m=+47.479095804" watchObservedRunningTime="2025-12-16 12:19:49.792885065 +0000 UTC m=+47.528454947" Dec 16 12:19:49.813897 systemd[1]: Started cri-containerd-f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b.scope - libcontainer container f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b. Dec 16 12:19:49.836537 systemd-networkd[1470]: cali79d5c99b2be: Link UP Dec 16 12:19:49.839565 systemd-networkd[1470]: cali79d5c99b2be: Gained carrier Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.501 [INFO][4243] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.527 [INFO][4243] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0 calico-kube-controllers-5d865fff8d- calico-system bbfa367f-11d3-466c-9181-c6ee23836f5f 814 0 2025-12-16 12:19:29 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5d865fff8d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-0-0-5-8fe0b910ae calico-kube-controllers-5d865fff8d-bxz6x eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali79d5c99b2be [] [] }} ContainerID="64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" Namespace="calico-system" Pod="calico-kube-controllers-5d865fff8d-bxz6x" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.527 [INFO][4243] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" Namespace="calico-system" Pod="calico-kube-controllers-5d865fff8d-bxz6x" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.565 [INFO][4267] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" HandleID="k8s-pod-network.64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.566 [INFO][4267] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" HandleID="k8s-pod-network.64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2fe0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-5-8fe0b910ae", "pod":"calico-kube-controllers-5d865fff8d-bxz6x", "timestamp":"2025-12-16 
12:19:49.565592175 +0000 UTC"}, Hostname:"ci-4547-0-0-5-8fe0b910ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.566 [INFO][4267] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.622 [INFO][4267] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.622 [INFO][4267] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-8fe0b910ae' Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.678 [INFO][4267] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.690 [INFO][4267] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.736 [INFO][4267] ipam/ipam.go 511: Trying affinity for 192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.751 [INFO][4267] ipam/ipam.go 158: Attempting to load block cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.761 [INFO][4267] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.761 [INFO][4267] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.771 [INFO][4267] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.810 [INFO][4267] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.827 [INFO][4267] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.126.68/26] block=192.168.126.64/26 handle="k8s-pod-network.64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.827 [INFO][4267] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.126.68/26] handle="k8s-pod-network.64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.828 [INFO][4267] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
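The pod_startup_latency_tracker record a little earlier reports podStartSLOduration=41.792885065s for kube-system/coredns-66bc5c9577-zf8pk; with firstStartedPulling and lastFinishedPulling both zero in that record, the figure matches the plain gap between the pod's creationTimestamp and the quoted watchObservedRunningTime. A quick check (timestamps copied from the log, truncated to microseconds) reproduces it:

    from datetime import datetime, timezone

    created  = datetime(2025, 12, 16, 12, 19, 8, tzinfo=timezone.utc)           # podCreationTimestamp
    observed = datetime(2025, 12, 16, 12, 19, 49, 792885, tzinfo=timezone.utc)  # watchObservedRunningTime
    print((observed - created).total_seconds())   # 41.792885, matching podStartSLOduration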
Dec 16 12:19:49.860980 containerd[1576]: 2025-12-16 12:19:49.828 [INFO][4267] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.126.68/26] IPv6=[] ContainerID="64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" HandleID="k8s-pod-network.64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0" Dec 16 12:19:49.862493 containerd[1576]: 2025-12-16 12:19:49.829 [INFO][4243] cni-plugin/k8s.go 418: Populated endpoint ContainerID="64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" Namespace="calico-system" Pod="calico-kube-controllers-5d865fff8d-bxz6x" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0", GenerateName:"calico-kube-controllers-5d865fff8d-", Namespace:"calico-system", SelfLink:"", UID:"bbfa367f-11d3-466c-9181-c6ee23836f5f", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d865fff8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"", Pod:"calico-kube-controllers-5d865fff8d-bxz6x", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.126.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali79d5c99b2be", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:49.862493 containerd[1576]: 2025-12-16 12:19:49.829 [INFO][4243] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.68/32] ContainerID="64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" Namespace="calico-system" Pod="calico-kube-controllers-5d865fff8d-bxz6x" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0" Dec 16 12:19:49.862493 containerd[1576]: 2025-12-16 12:19:49.830 [INFO][4243] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali79d5c99b2be ContainerID="64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" Namespace="calico-system" Pod="calico-kube-controllers-5d865fff8d-bxz6x" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0" Dec 16 12:19:49.862493 containerd[1576]: 2025-12-16 12:19:49.839 [INFO][4243] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" Namespace="calico-system" Pod="calico-kube-controllers-5d865fff8d-bxz6x" 
WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0" Dec 16 12:19:49.862493 containerd[1576]: 2025-12-16 12:19:49.840 [INFO][4243] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" Namespace="calico-system" Pod="calico-kube-controllers-5d865fff8d-bxz6x" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0", GenerateName:"calico-kube-controllers-5d865fff8d-", Namespace:"calico-system", SelfLink:"", UID:"bbfa367f-11d3-466c-9181-c6ee23836f5f", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d865fff8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b", Pod:"calico-kube-controllers-5d865fff8d-bxz6x", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.126.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali79d5c99b2be", MAC:"7a:b1:f3:ef:15:fe", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:49.862493 containerd[1576]: 2025-12-16 12:19:49.856 [INFO][4243] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" Namespace="calico-system" Pod="calico-kube-controllers-5d865fff8d-bxz6x" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--kube--controllers--5d865fff8d--bxz6x-eth0" Dec 16 12:19:49.868590 systemd-networkd[1470]: calie783b953121: Gained IPv6LL Dec 16 12:19:49.905188 containerd[1576]: time="2025-12-16T12:19:49.905131811Z" level=info msg="connecting to shim 64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b" address="unix:///run/containerd/s/22a3edf1af4ea53204c2f858914cf614ccbd19edb75800de8c7bac079fc69483" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:49.907000 audit[4332]: NETFILTER_CFG table=filter:119 family=2 entries=22 op=nft_register_rule pid=4332 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:49.907000 audit[4332]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffcb4a6ba0 a2=0 a3=1 items=0 ppid=2972 pid=4332 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.907000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:49.913000 audit[4332]: NETFILTER_CFG table=nat:120 family=2 entries=12 op=nft_register_rule pid=4332 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:49.913000 audit[4332]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffcb4a6ba0 a2=0 a3=1 items=0 ppid=2972 pid=4332 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.913000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:49.919000 audit: BPF prog-id=188 op=LOAD Dec 16 12:19:49.921000 audit: BPF prog-id=189 op=LOAD Dec 16 12:19:49.921000 audit[4303]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128180 a2=98 a3=0 items=0 ppid=4292 pid=4303 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632353537333739353634396134383261303766626264666437383039 Dec 16 12:19:49.921000 audit: BPF prog-id=189 op=UNLOAD Dec 16 12:19:49.921000 audit[4303]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4292 pid=4303 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632353537333739353634396134383261303766626264666437383039 Dec 16 12:19:49.921000 audit: BPF prog-id=190 op=LOAD Dec 16 12:19:49.921000 audit[4303]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=4292 pid=4303 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632353537333739353634396134383261303766626264666437383039 Dec 16 12:19:49.921000 audit: BPF prog-id=191 op=LOAD Dec 16 12:19:49.921000 audit[4303]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=4292 pid=4303 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632353537333739353634396134383261303766626264666437383039 Dec 16 
12:19:49.922000 audit: BPF prog-id=191 op=UNLOAD Dec 16 12:19:49.922000 audit[4303]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4292 pid=4303 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.922000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632353537333739353634396134383261303766626264666437383039 Dec 16 12:19:49.922000 audit: BPF prog-id=190 op=UNLOAD Dec 16 12:19:49.922000 audit[4303]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4292 pid=4303 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.922000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632353537333739353634396134383261303766626264666437383039 Dec 16 12:19:49.922000 audit: BPF prog-id=192 op=LOAD Dec 16 12:19:49.922000 audit[4303]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=4292 pid=4303 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.922000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632353537333739353634396134383261303766626264666437383039 Dec 16 12:19:49.958930 systemd[1]: Started cri-containerd-64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b.scope - libcontainer container 64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b. 
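The audit PROCTITLE fields in the records above are the process command lines, hex-encoded with NUL bytes separating the arguments (the kernel truncates long proctitles, which is why the runc values end partway through the container ID). A minimal sketch for decoding such a value offline; the sample string is the iptables-restore proctitle copied from the records above.

def decode_proctitle(hex_value: str) -> str:
    # An audit PROCTITLE value is the argv of the process, hex-encoded,
    # with NUL bytes separating the individual arguments.
    raw = bytes.fromhex(hex_value)
    return " ".join(part.decode("utf-8", "replace") for part in raw.split(b"\x00") if part)

# Value copied from the iptables-restore audit records above.
print(decode_proctitle(
    "69707461626C65732D726573746F7265002D770035"
    "002D2D6E6F666C757368002D2D636F756E74657273"
))  # -> iptables-restore -w 5 --noflush --counters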
Dec 16 12:19:49.974790 containerd[1576]: time="2025-12-16T12:19:49.974145545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85ddfd7b99-s9l56,Uid:b4c4df94-c3ee-426c-b06d-ed9edc99469b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f25573795649a482a07fbbdfd7809eda05c516eee336085287cd740edc1a111b\"" Dec 16 12:19:49.976000 audit: BPF prog-id=193 op=LOAD Dec 16 12:19:49.978542 containerd[1576]: time="2025-12-16T12:19:49.978474267Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:19:49.978000 audit: BPF prog-id=194 op=LOAD Dec 16 12:19:49.978000 audit[4349]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4338 pid=4349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613034633336653066343033323066373931323631343530383931 Dec 16 12:19:49.978000 audit: BPF prog-id=194 op=UNLOAD Dec 16 12:19:49.978000 audit[4349]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4338 pid=4349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613034633336653066343033323066373931323631343530383931 Dec 16 12:19:49.978000 audit: BPF prog-id=195 op=LOAD Dec 16 12:19:49.978000 audit[4349]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4338 pid=4349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613034633336653066343033323066373931323631343530383931 Dec 16 12:19:49.978000 audit: BPF prog-id=196 op=LOAD Dec 16 12:19:49.978000 audit[4349]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4338 pid=4349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613034633336653066343033323066373931323631343530383931 Dec 16 12:19:49.978000 audit: BPF prog-id=196 op=UNLOAD Dec 16 12:19:49.978000 audit[4349]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4338 pid=4349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613034633336653066343033323066373931323631343530383931 Dec 16 12:19:49.978000 audit: BPF prog-id=195 op=UNLOAD Dec 16 12:19:49.978000 audit[4349]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4338 pid=4349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613034633336653066343033323066373931323631343530383931 Dec 16 12:19:49.978000 audit: BPF prog-id=197 op=LOAD Dec 16 12:19:49.978000 audit[4349]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4338 pid=4349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:49.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634613034633336653066343033323066373931323631343530383931 Dec 16 12:19:50.027411 containerd[1576]: time="2025-12-16T12:19:50.026152235Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d865fff8d-bxz6x,Uid:bbfa367f-11d3-466c-9181-c6ee23836f5f,Namespace:calico-system,Attempt:0,} returns sandbox id \"64a04c36e0f40320f791261450891441d8eb1289c291b04ff1f060326e5eab2b\"" Dec 16 12:19:50.322117 containerd[1576]: time="2025-12-16T12:19:50.321950153Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:50.323570 containerd[1576]: time="2025-12-16T12:19:50.323513676Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:19:50.324038 containerd[1576]: time="2025-12-16T12:19:50.323845846Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:50.324399 kubelet[2821]: E1216 12:19:50.324266 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:50.324399 kubelet[2821]: E1216 12:19:50.324375 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:50.326651 containerd[1576]: 
time="2025-12-16T12:19:50.324997918Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:19:50.326763 kubelet[2821]: E1216 12:19:50.325069 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-85ddfd7b99-s9l56_calico-apiserver(b4c4df94-c3ee-426c-b06d-ed9edc99469b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:50.326763 kubelet[2821]: E1216 12:19:50.325119 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:19:50.443417 containerd[1576]: time="2025-12-16T12:19:50.443300836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-v6jwv,Uid:2c875d65-0c1c-4f65-a11e-e47b73dda454,Namespace:kube-system,Attempt:0,}" Dec 16 12:19:50.599148 systemd-networkd[1470]: cali14a9997d8f9: Link UP Dec 16 12:19:50.600075 systemd-networkd[1470]: cali14a9997d8f9: Gained carrier Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.490 [INFO][4405] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.510 [INFO][4405] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0 coredns-66bc5c9577- kube-system 2c875d65-0c1c-4f65-a11e-e47b73dda454 813 0 2025-12-16 12:19:08 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-5-8fe0b910ae coredns-66bc5c9577-v6jwv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali14a9997d8f9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" Namespace="kube-system" Pod="coredns-66bc5c9577-v6jwv" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.510 [INFO][4405] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" Namespace="kube-system" Pod="coredns-66bc5c9577-v6jwv" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.543 [INFO][4416] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" HandleID="k8s-pod-network.5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.543 [INFO][4416] ipam/ipam_plugin.go 275: Auto assigning IP 
ContainerID="5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" HandleID="k8s-pod-network.5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024af80), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-5-8fe0b910ae", "pod":"coredns-66bc5c9577-v6jwv", "timestamp":"2025-12-16 12:19:50.543524934 +0000 UTC"}, Hostname:"ci-4547-0-0-5-8fe0b910ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.543 [INFO][4416] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.543 [INFO][4416] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.543 [INFO][4416] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-8fe0b910ae' Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.555 [INFO][4416] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.560 [INFO][4416] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.567 [INFO][4416] ipam/ipam.go 511: Trying affinity for 192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.569 [INFO][4416] ipam/ipam.go 158: Attempting to load block cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.573 [INFO][4416] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.574 [INFO][4416] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.576 [INFO][4416] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552 Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.582 [INFO][4416] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.593 [INFO][4416] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.126.69/26] block=192.168.126.64/26 handle="k8s-pod-network.5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.593 [INFO][4416] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.126.69/26] handle="k8s-pod-network.5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.593 [INFO][4416] 
ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:19:50.622967 containerd[1576]: 2025-12-16 12:19:50.593 [INFO][4416] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.126.69/26] IPv6=[] ContainerID="5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" HandleID="k8s-pod-network.5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0" Dec 16 12:19:50.624473 containerd[1576]: 2025-12-16 12:19:50.595 [INFO][4405] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" Namespace="kube-system" Pod="coredns-66bc5c9577-v6jwv" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"2c875d65-0c1c-4f65-a11e-e47b73dda454", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"", Pod:"coredns-66bc5c9577-v6jwv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.126.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali14a9997d8f9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:50.624473 containerd[1576]: 2025-12-16 12:19:50.595 [INFO][4405] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.69/32] ContainerID="5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" Namespace="kube-system" Pod="coredns-66bc5c9577-v6jwv" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0" Dec 16 12:19:50.624473 containerd[1576]: 2025-12-16 12:19:50.596 [INFO][4405] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali14a9997d8f9 ContainerID="5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" Namespace="kube-system" 
Pod="coredns-66bc5c9577-v6jwv" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0" Dec 16 12:19:50.624473 containerd[1576]: 2025-12-16 12:19:50.601 [INFO][4405] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" Namespace="kube-system" Pod="coredns-66bc5c9577-v6jwv" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0" Dec 16 12:19:50.624473 containerd[1576]: 2025-12-16 12:19:50.601 [INFO][4405] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" Namespace="kube-system" Pod="coredns-66bc5c9577-v6jwv" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"2c875d65-0c1c-4f65-a11e-e47b73dda454", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552", Pod:"coredns-66bc5c9577-v6jwv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.126.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali14a9997d8f9", MAC:"f2:0e:ce:e7:74:e5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:50.624753 containerd[1576]: 2025-12-16 12:19:50.617 [INFO][4405] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" Namespace="kube-system" Pod="coredns-66bc5c9577-v6jwv" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-coredns--66bc5c9577--v6jwv-eth0" Dec 16 12:19:50.650878 containerd[1576]: time="2025-12-16T12:19:50.650262452Z" level=info msg="connecting to shim 
5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552" address="unix:///run/containerd/s/cdcb3bf94a289acfeee105daa0f4ade079f1c32c45b3f19600ce90b076aeceae" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:50.692094 systemd[1]: Started cri-containerd-5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552.scope - libcontainer container 5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552. Dec 16 12:19:50.712522 kubelet[2821]: E1216 12:19:50.712400 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:19:50.720000 audit: BPF prog-id=198 op=LOAD Dec 16 12:19:50.723000 audit: BPF prog-id=199 op=LOAD Dec 16 12:19:50.723000 audit[4448]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128180 a2=98 a3=0 items=0 ppid=4436 pid=4448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562343336643762656364373433373561646330346530633834363637 Dec 16 12:19:50.723000 audit: BPF prog-id=199 op=UNLOAD Dec 16 12:19:50.723000 audit[4448]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4436 pid=4448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562343336643762656364373433373561646330346530633834363637 Dec 16 12:19:50.723000 audit: BPF prog-id=200 op=LOAD Dec 16 12:19:50.723000 audit[4448]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=4436 pid=4448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562343336643762656364373433373561646330346530633834363637 Dec 16 12:19:50.723000 audit: BPF prog-id=201 op=LOAD Dec 16 12:19:50.723000 audit[4448]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=4436 pid=4448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.723000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562343336643762656364373433373561646330346530633834363637 Dec 16 12:19:50.723000 audit: BPF prog-id=201 op=UNLOAD Dec 16 12:19:50.723000 audit[4448]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4436 pid=4448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562343336643762656364373433373561646330346530633834363637 Dec 16 12:19:50.723000 audit: BPF prog-id=200 op=UNLOAD Dec 16 12:19:50.723000 audit[4448]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4436 pid=4448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562343336643762656364373433373561646330346530633834363637 Dec 16 12:19:50.723000 audit: BPF prog-id=202 op=LOAD Dec 16 12:19:50.723000 audit[4448]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=4436 pid=4448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.723000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562343336643762656364373433373561646330346530633834363637 Dec 16 12:19:50.761033 containerd[1576]: time="2025-12-16T12:19:50.760994241Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-v6jwv,Uid:2c875d65-0c1c-4f65-a11e-e47b73dda454,Namespace:kube-system,Attempt:0,} returns sandbox id \"5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552\"" Dec 16 12:19:50.765992 containerd[1576]: time="2025-12-16T12:19:50.765860816Z" level=info msg="CreateContainer within sandbox \"5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:19:50.780118 containerd[1576]: time="2025-12-16T12:19:50.779477834Z" level=info msg="Container 5da100b830946ccaecff4996566213cbf48149ed411f7032be8976799875b7ac: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:19:50.787794 containerd[1576]: time="2025-12-16T12:19:50.787749703Z" level=info msg="CreateContainer within sandbox \"5b436d7becd74375adc04e0c84667daab99cdc7beb1a14c0e5757eeaddb1e552\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5da100b830946ccaecff4996566213cbf48149ed411f7032be8976799875b7ac\"" Dec 16 12:19:50.789479 containerd[1576]: time="2025-12-16T12:19:50.789422669Z" level=info msg="StartContainer for 
\"5da100b830946ccaecff4996566213cbf48149ed411f7032be8976799875b7ac\"" Dec 16 12:19:50.792035 containerd[1576]: time="2025-12-16T12:19:50.791973380Z" level=info msg="connecting to shim 5da100b830946ccaecff4996566213cbf48149ed411f7032be8976799875b7ac" address="unix:///run/containerd/s/cdcb3bf94a289acfeee105daa0f4ade079f1c32c45b3f19600ce90b076aeceae" protocol=ttrpc version=3 Dec 16 12:19:50.817168 systemd[1]: Started cri-containerd-5da100b830946ccaecff4996566213cbf48149ed411f7032be8976799875b7ac.scope - libcontainer container 5da100b830946ccaecff4996566213cbf48149ed411f7032be8976799875b7ac. Dec 16 12:19:50.835000 audit: BPF prog-id=203 op=LOAD Dec 16 12:19:50.835000 audit: BPF prog-id=204 op=LOAD Dec 16 12:19:50.835000 audit[4473]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4436 pid=4473 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.835000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564613130306238333039343663636165636666343939363536363231 Dec 16 12:19:50.835000 audit: BPF prog-id=204 op=UNLOAD Dec 16 12:19:50.835000 audit[4473]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4436 pid=4473 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.835000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564613130306238333039343663636165636666343939363536363231 Dec 16 12:19:50.835000 audit: BPF prog-id=205 op=LOAD Dec 16 12:19:50.835000 audit[4473]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4436 pid=4473 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.835000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564613130306238333039343663636165636666343939363536363231 Dec 16 12:19:50.835000 audit: BPF prog-id=206 op=LOAD Dec 16 12:19:50.835000 audit[4473]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4436 pid=4473 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.835000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564613130306238333039343663636165636666343939363536363231 Dec 16 12:19:50.835000 audit: BPF prog-id=206 op=UNLOAD Dec 16 12:19:50.835000 audit[4473]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 
ppid=4436 pid=4473 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.835000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564613130306238333039343663636165636666343939363536363231 Dec 16 12:19:50.835000 audit: BPF prog-id=205 op=UNLOAD Dec 16 12:19:50.835000 audit[4473]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4436 pid=4473 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.835000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564613130306238333039343663636165636666343939363536363231 Dec 16 12:19:50.835000 audit: BPF prog-id=207 op=LOAD Dec 16 12:19:50.835000 audit[4473]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4436 pid=4473 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.835000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564613130306238333039343663636165636666343939363536363231 Dec 16 12:19:50.865659 containerd[1576]: time="2025-12-16T12:19:50.863739409Z" level=info msg="StartContainer for \"5da100b830946ccaecff4996566213cbf48149ed411f7032be8976799875b7ac\" returns successfully" Dec 16 12:19:50.924054 containerd[1576]: time="2025-12-16T12:19:50.923896076Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:50.926821 containerd[1576]: time="2025-12-16T12:19:50.926735395Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:19:50.927118 containerd[1576]: time="2025-12-16T12:19:50.926981922Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:50.928653 kubelet[2821]: E1216 12:19:50.927352 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:19:50.928653 kubelet[2821]: E1216 12:19:50.927440 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 
12:19:50.928653 kubelet[2821]: E1216 12:19:50.927582 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5d865fff8d-bxz6x_calico-system(bbfa367f-11d3-466c-9181-c6ee23836f5f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:50.929705 kubelet[2821]: E1216 12:19:50.927615 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:19:50.952000 audit[4505]: NETFILTER_CFG table=filter:121 family=2 entries=19 op=nft_register_rule pid=4505 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:50.952000 audit[4505]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffc62a98a0 a2=0 a3=1 items=0 ppid=2972 pid=4505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.952000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:50.961000 audit[4505]: NETFILTER_CFG table=nat:122 family=2 entries=33 op=nft_register_chain pid=4505 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:50.961000 audit[4505]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=13428 a0=3 a1=ffffc62a98a0 a2=0 a3=1 items=0 ppid=2972 pid=4505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:50.961000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:51.147911 systemd-networkd[1470]: calie27493fd194: Gained IPv6LL Dec 16 12:19:51.451507 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1383671386.mount: Deactivated successfully. 
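The pull failures above follow the usual kubelet pattern: the first failed pull for a container is reported as ErrImagePull, and subsequent pod syncs report ImagePullBackOff while the kubelet waits out an exponentially growing delay before retrying. A minimal sketch of that schedule, assuming the long-standing upstream defaults of a 10-second initial delay doubling up to a 300-second cap; the exact values depend on the kubelet version and configuration.

def image_pull_backoff(failures: int, initial: float = 10.0, cap: float = 300.0):
    # Delay applied before each retry after a failed image pull:
    # it doubles on every consecutive failure and is capped.
    delay = initial
    for _ in range(failures):
        yield delay
        delay = min(delay * 2, cap)

print(list(image_pull_backoff(7)))
# [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0]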
Dec 16 12:19:51.720303 kubelet[2821]: E1216 12:19:51.720035 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:19:51.723265 kubelet[2821]: E1216 12:19:51.723134 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:19:51.741688 kubelet[2821]: I1216 12:19:51.741597 2821 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-v6jwv" podStartSLOduration=43.741578923 podStartE2EDuration="43.741578923s" podCreationTimestamp="2025-12-16 12:19:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:19:51.739670471 +0000 UTC m=+49.475240353" watchObservedRunningTime="2025-12-16 12:19:51.741578923 +0000 UTC m=+49.477148805" Dec 16 12:19:51.787776 systemd-networkd[1470]: cali79d5c99b2be: Gained IPv6LL Dec 16 12:19:51.980000 audit[4527]: NETFILTER_CFG table=filter:123 family=2 entries=16 op=nft_register_rule pid=4527 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:51.980000 audit[4527]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffea460150 a2=0 a3=1 items=0 ppid=2972 pid=4527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:51.980000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:51.993000 audit[4527]: NETFILTER_CFG table=nat:124 family=2 entries=54 op=nft_register_chain pid=4527 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:51.993000 audit[4527]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19092 a0=3 a1=ffffea460150 a2=0 a3=1 items=0 ppid=2972 pid=4527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:51.993000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:52.107932 systemd-networkd[1470]: cali14a9997d8f9: Gained IPv6LL Dec 16 12:19:52.455691 containerd[1576]: time="2025-12-16T12:19:52.454778038Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6nr2l,Uid:30279a80-ac32-4c4e-affe-8e2742945896,Namespace:calico-system,Attempt:0,}" Dec 16 12:19:52.462180 containerd[1576]: 
time="2025-12-16T12:19:52.462140478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-85msp,Uid:6ad46257-20db-42d2-b357-93e753f0c2ca,Namespace:calico-system,Attempt:0,}" Dec 16 12:19:52.466905 containerd[1576]: time="2025-12-16T12:19:52.466264830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85ddfd7b99-lzr5b,Uid:4dd8654f-30cc-4aed-bf7b-e1600d664a65,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:19:52.819016 systemd-networkd[1470]: cali85329eceefd: Link UP Dec 16 12:19:52.820900 systemd-networkd[1470]: cali85329eceefd: Gained carrier Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.578 [INFO][4566] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.611 [INFO][4566] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0 calico-apiserver-85ddfd7b99- calico-apiserver 4dd8654f-30cc-4aed-bf7b-e1600d664a65 816 0 2025-12-16 12:19:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85ddfd7b99 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-5-8fe0b910ae calico-apiserver-85ddfd7b99-lzr5b eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali85329eceefd [] [] }} ContainerID="aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-lzr5b" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.612 [INFO][4566] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-lzr5b" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.670 [INFO][4590] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" HandleID="k8s-pod-network.aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.671 [INFO][4590] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" HandleID="k8s-pod-network.aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb5d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-5-8fe0b910ae", "pod":"calico-apiserver-85ddfd7b99-lzr5b", "timestamp":"2025-12-16 12:19:52.670559817 +0000 UTC"}, Hostname:"ci-4547-0-0-5-8fe0b910ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.671 [INFO][4590] 
ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.672 [INFO][4590] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.672 [INFO][4590] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-8fe0b910ae' Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.687 [INFO][4590] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.704 [INFO][4590] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.726 [INFO][4590] ipam/ipam.go 511: Trying affinity for 192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.736 [INFO][4590] ipam/ipam.go 158: Attempting to load block cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.746 [INFO][4590] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.747 [INFO][4590] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.754 [INFO][4590] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42 Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.764 [INFO][4590] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.781 [INFO][4590] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.126.70/26] block=192.168.126.64/26 handle="k8s-pod-network.aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.781 [INFO][4590] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.126.70/26] handle="k8s-pod-network.aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.781 [INFO][4590] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
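The IPAM records above always walk the same sequence: take the host-wide lock, confirm the host's affinity for the 192.168.126.64/26 block, load the block, claim the next free address against a per-container handle, and write the block back to the datastore. A toy model of that bookkeeping, purely to illustrate the logged steps; it is not Calico's implementation, and the pre-claimed addresses below are assumed from the earlier assignments in this log.

import ipaddress

class Block:
    # Toy stand-in for an IPAM block that is already affine to this host.
    def __init__(self, cidr: str):
        self.cidr = ipaddress.ip_network(cidr)
        self.allocations = {}  # ip -> handle

    def auto_assign(self, handle: str) -> ipaddress.IPv4Address:
        # "Attempting to assign 1 addresses from block" followed by
        # "Writing block in order to claim IPs" in the records above.
        for ip in self.cidr.hosts():
            if ip not in self.allocations:
                self.allocations[ip] = handle
                return ip
        raise RuntimeError("block exhausted")

block = Block("192.168.126.64/26")
# .65-.69 are assumed already claimed by the pods networked earlier in this
# log (.68 and .69 appear explicitly above), so the next claim yields .70.
for taken in range(65, 70):
    block.allocations[ipaddress.ip_address(f"192.168.126.{taken}")] = "pre-existing"
print(block.auto_assign("k8s-pod-network.aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42"))
# 192.168.126.70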
Dec 16 12:19:52.854953 containerd[1576]: 2025-12-16 12:19:52.781 [INFO][4590] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.126.70/26] IPv6=[] ContainerID="aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" HandleID="k8s-pod-network.aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0" Dec 16 12:19:52.855704 containerd[1576]: 2025-12-16 12:19:52.804 [INFO][4566] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-lzr5b" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0", GenerateName:"calico-apiserver-85ddfd7b99-", Namespace:"calico-apiserver", SelfLink:"", UID:"4dd8654f-30cc-4aed-bf7b-e1600d664a65", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85ddfd7b99", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"", Pod:"calico-apiserver-85ddfd7b99-lzr5b", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.126.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali85329eceefd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:52.855704 containerd[1576]: 2025-12-16 12:19:52.808 [INFO][4566] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.70/32] ContainerID="aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-lzr5b" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0" Dec 16 12:19:52.855704 containerd[1576]: 2025-12-16 12:19:52.808 [INFO][4566] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali85329eceefd ContainerID="aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-lzr5b" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0" Dec 16 12:19:52.855704 containerd[1576]: 2025-12-16 12:19:52.823 [INFO][4566] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-lzr5b" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0" Dec 16 12:19:52.855704 containerd[1576]: 2025-12-16 
12:19:52.824 [INFO][4566] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-lzr5b" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0", GenerateName:"calico-apiserver-85ddfd7b99-", Namespace:"calico-apiserver", SelfLink:"", UID:"4dd8654f-30cc-4aed-bf7b-e1600d664a65", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85ddfd7b99", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42", Pod:"calico-apiserver-85ddfd7b99-lzr5b", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.126.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali85329eceefd", MAC:"8a:49:e1:af:19:30", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:52.855704 containerd[1576]: 2025-12-16 12:19:52.846 [INFO][4566] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" Namespace="calico-apiserver" Pod="calico-apiserver-85ddfd7b99-lzr5b" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-calico--apiserver--85ddfd7b99--lzr5b-eth0" Dec 16 12:19:52.906928 systemd-networkd[1470]: califc70e9e1e2f: Link UP Dec 16 12:19:52.907198 systemd-networkd[1470]: califc70e9e1e2f: Gained carrier Dec 16 12:19:52.918542 containerd[1576]: time="2025-12-16T12:19:52.918149500Z" level=info msg="connecting to shim aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42" address="unix:///run/containerd/s/e68e78cabbf171d926f2a3e45855d39815450988106ba85f73b8940a4fc5b9f0" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.550 [INFO][4554] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.600 [INFO][4554] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0 goldmane-7c778bb748- calico-system 6ad46257-20db-42d2-b357-93e753f0c2ca 812 0 2025-12-16 12:19:25 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s 
ci-4547-0-0-5-8fe0b910ae goldmane-7c778bb748-85msp eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] califc70e9e1e2f [] [] }} ContainerID="3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" Namespace="calico-system" Pod="goldmane-7c778bb748-85msp" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.600 [INFO][4554] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" Namespace="calico-system" Pod="goldmane-7c778bb748-85msp" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.723 [INFO][4588] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" HandleID="k8s-pod-network.3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.725 [INFO][4588] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" HandleID="k8s-pod-network.3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002aa4c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-5-8fe0b910ae", "pod":"goldmane-7c778bb748-85msp", "timestamp":"2025-12-16 12:19:52.723138765 +0000 UTC"}, Hostname:"ci-4547-0-0-5-8fe0b910ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.725 [INFO][4588] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.781 [INFO][4588] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.782 [INFO][4588] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-8fe0b910ae' Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.796 [INFO][4588] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.807 [INFO][4588] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.834 [INFO][4588] ipam/ipam.go 511: Trying affinity for 192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.839 [INFO][4588] ipam/ipam.go 158: Attempting to load block cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.848 [INFO][4588] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.848 [INFO][4588] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.854 [INFO][4588] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095 Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.863 [INFO][4588] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.880 [INFO][4588] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.126.71/26] block=192.168.126.64/26 handle="k8s-pod-network.3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.881 [INFO][4588] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.126.71/26] handle="k8s-pod-network.3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.881 [INFO][4588] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:19:52.954966 containerd[1576]: 2025-12-16 12:19:52.881 [INFO][4588] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.126.71/26] IPv6=[] ContainerID="3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" HandleID="k8s-pod-network.3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0" Dec 16 12:19:52.955586 containerd[1576]: 2025-12-16 12:19:52.892 [INFO][4554] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" Namespace="calico-system" Pod="goldmane-7c778bb748-85msp" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"6ad46257-20db-42d2-b357-93e753f0c2ca", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"", Pod:"goldmane-7c778bb748-85msp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.126.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califc70e9e1e2f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:52.955586 containerd[1576]: 2025-12-16 12:19:52.892 [INFO][4554] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.71/32] ContainerID="3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" Namespace="calico-system" Pod="goldmane-7c778bb748-85msp" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0" Dec 16 12:19:52.955586 containerd[1576]: 2025-12-16 12:19:52.892 [INFO][4554] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califc70e9e1e2f ContainerID="3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" Namespace="calico-system" Pod="goldmane-7c778bb748-85msp" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0" Dec 16 12:19:52.955586 containerd[1576]: 2025-12-16 12:19:52.908 [INFO][4554] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" Namespace="calico-system" Pod="goldmane-7c778bb748-85msp" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0" Dec 16 12:19:52.955586 containerd[1576]: 2025-12-16 12:19:52.914 [INFO][4554] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" 
Namespace="calico-system" Pod="goldmane-7c778bb748-85msp" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"6ad46257-20db-42d2-b357-93e753f0c2ca", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095", Pod:"goldmane-7c778bb748-85msp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.126.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califc70e9e1e2f", MAC:"16:cb:ee:8a:32:b1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:52.955586 containerd[1576]: 2025-12-16 12:19:52.945 [INFO][4554] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" Namespace="calico-system" Pod="goldmane-7c778bb748-85msp" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-goldmane--7c778bb748--85msp-eth0" Dec 16 12:19:52.974513 systemd[1]: Started cri-containerd-aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42.scope - libcontainer container aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42. 
Dec 16 12:19:53.003545 containerd[1576]: time="2025-12-16T12:19:53.003282650Z" level=info msg="connecting to shim 3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095" address="unix:///run/containerd/s/194a298fdb521ad2cf5e1e2f7a1f02e5baefbe5ad55daf4537975b3892e438d2" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:53.011147 systemd-networkd[1470]: calif0773f5ba7f: Link UP Dec 16 12:19:53.012958 systemd-networkd[1470]: calif0773f5ba7f: Gained carrier Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.604 [INFO][4542] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.646 [INFO][4542] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0 csi-node-driver- calico-system 30279a80-ac32-4c4e-affe-8e2742945896 719 0 2025-12-16 12:19:28 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-0-0-5-8fe0b910ae csi-node-driver-6nr2l eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif0773f5ba7f [] [] }} ContainerID="4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" Namespace="calico-system" Pod="csi-node-driver-6nr2l" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.647 [INFO][4542] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" Namespace="calico-system" Pod="csi-node-driver-6nr2l" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.758 [INFO][4600] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" HandleID="k8s-pod-network.4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.759 [INFO][4600] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" HandleID="k8s-pod-network.4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b010), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-5-8fe0b910ae", "pod":"csi-node-driver-6nr2l", "timestamp":"2025-12-16 12:19:52.758742332 +0000 UTC"}, Hostname:"ci-4547-0-0-5-8fe0b910ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.759 [INFO][4600] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.881 [INFO][4600] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.881 [INFO][4600] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-8fe0b910ae' Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.924 [INFO][4600] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.938 [INFO][4600] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.960 [INFO][4600] ipam/ipam.go 511: Trying affinity for 192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.967 [INFO][4600] ipam/ipam.go 158: Attempting to load block cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.974 [INFO][4600] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.126.64/26 host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.974 [INFO][4600] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.126.64/26 handle="k8s-pod-network.4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.978 [INFO][4600] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.985 [INFO][4600] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.126.64/26 handle="k8s-pod-network.4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.997 [INFO][4600] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.126.72/26] block=192.168.126.64/26 handle="k8s-pod-network.4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.997 [INFO][4600] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.126.72/26] handle="k8s-pod-network.4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" host="ci-4547-0-0-5-8fe0b910ae" Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.997 [INFO][4600] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
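All three pods in this section are assigned out of the same host-affine block: 192.168.126.70, .71 and .72, each from 192.168.126.64/26. A quick standard-library Go check that those addresses really fall inside that block (and that a /26 holds 64 addresses):

```go
// Verify the IPs claimed in the surrounding log entries belong to the
// host's affine IPAM block 192.168.126.64/26.
package main

import (
	"fmt"
	"net"
)

func main() {
	_, block, err := net.ParseCIDR("192.168.126.64/26")
	if err != nil {
		panic(err)
	}

	ones, bits := block.Mask.Size()
	fmt.Printf("%s holds %d addresses\n", block, 1<<(bits-ones)) // 64

	for _, s := range []string{"192.168.126.70", "192.168.126.71", "192.168.126.72"} {
		ip := net.ParseIP(s)
		fmt.Printf("%s in %s: %v\n", ip, block, block.Contains(ip)) // true for all three
	}
}
```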
Dec 16 12:19:53.047634 containerd[1576]: 2025-12-16 12:19:52.997 [INFO][4600] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.126.72/26] IPv6=[] ContainerID="4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" HandleID="k8s-pod-network.4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" Workload="ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0" Dec 16 12:19:53.048215 containerd[1576]: 2025-12-16 12:19:53.001 [INFO][4542] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" Namespace="calico-system" Pod="csi-node-driver-6nr2l" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"30279a80-ac32-4c4e-affe-8e2742945896", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"", Pod:"csi-node-driver-6nr2l", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.126.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif0773f5ba7f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:53.048215 containerd[1576]: 2025-12-16 12:19:53.003 [INFO][4542] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.126.72/32] ContainerID="4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" Namespace="calico-system" Pod="csi-node-driver-6nr2l" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0" Dec 16 12:19:53.048215 containerd[1576]: 2025-12-16 12:19:53.003 [INFO][4542] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif0773f5ba7f ContainerID="4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" Namespace="calico-system" Pod="csi-node-driver-6nr2l" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0" Dec 16 12:19:53.048215 containerd[1576]: 2025-12-16 12:19:53.014 [INFO][4542] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" Namespace="calico-system" Pod="csi-node-driver-6nr2l" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0" Dec 16 12:19:53.048215 containerd[1576]: 2025-12-16 12:19:53.014 [INFO][4542] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" Namespace="calico-system" Pod="csi-node-driver-6nr2l" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"30279a80-ac32-4c4e-affe-8e2742945896", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 19, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-8fe0b910ae", ContainerID:"4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a", Pod:"csi-node-driver-6nr2l", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.126.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif0773f5ba7f", MAC:"de:5f:2d:14:48:47", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:19:53.048215 containerd[1576]: 2025-12-16 12:19:53.040 [INFO][4542] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" Namespace="calico-system" Pod="csi-node-driver-6nr2l" WorkloadEndpoint="ci--4547--0--0--5--8fe0b910ae-k8s-csi--node--driver--6nr2l-eth0" Dec 16 12:19:53.054893 systemd[1]: Started cri-containerd-3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095.scope - libcontainer container 3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095. 
Dec 16 12:19:53.078000 audit: BPF prog-id=208 op=LOAD Dec 16 12:19:53.080000 audit: BPF prog-id=209 op=LOAD Dec 16 12:19:53.080000 audit[4641]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4630 pid=4641 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161663863333035363865663163393939393364343562626436623438 Dec 16 12:19:53.080000 audit: BPF prog-id=209 op=UNLOAD Dec 16 12:19:53.080000 audit[4641]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4630 pid=4641 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161663863333035363865663163393939393364343562626436623438 Dec 16 12:19:53.082000 audit: BPF prog-id=210 op=LOAD Dec 16 12:19:53.082000 audit[4641]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4630 pid=4641 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.082000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161663863333035363865663163393939393364343562626436623438 Dec 16 12:19:53.083000 audit: BPF prog-id=211 op=LOAD Dec 16 12:19:53.083000 audit[4641]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4630 pid=4641 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.083000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161663863333035363865663163393939393364343562626436623438 Dec 16 12:19:53.084000 audit: BPF prog-id=211 op=UNLOAD Dec 16 12:19:53.084000 audit[4641]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4630 pid=4641 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161663863333035363865663163393939393364343562626436623438 Dec 16 12:19:53.084000 audit: BPF prog-id=210 op=UNLOAD Dec 16 12:19:53.084000 audit[4641]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4630 pid=4641 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161663863333035363865663163393939393364343562626436623438 Dec 16 12:19:53.085000 audit: BPF prog-id=212 op=LOAD Dec 16 12:19:53.085000 audit[4641]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4630 pid=4641 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.085000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161663863333035363865663163393939393364343562626436623438 Dec 16 12:19:53.089776 containerd[1576]: time="2025-12-16T12:19:53.089062838Z" level=info msg="connecting to shim 4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a" address="unix:///run/containerd/s/f1f46caab2c0b6d8fd6fcd40d6d0db0246f6a4208f033cc81c676d74a090b640" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:19:53.111000 audit: BPF prog-id=213 op=LOAD Dec 16 12:19:53.112000 audit: BPF prog-id=214 op=LOAD Dec 16 12:19:53.112000 audit[4686]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=4672 pid=4686 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.112000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364663666663232336339333066303164326262663164363634336561 Dec 16 12:19:53.114000 audit: BPF prog-id=214 op=UNLOAD Dec 16 12:19:53.114000 audit[4686]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4672 pid=4686 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.114000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364663666663232336339333066303164326262663164363634336561 Dec 16 12:19:53.117000 audit: BPF prog-id=215 op=LOAD Dec 16 12:19:53.117000 audit[4686]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=4672 pid=4686 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.117000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364663666663232336339333066303164326262663164363634336561 Dec 16 12:19:53.117000 audit: BPF prog-id=216 op=LOAD Dec 16 12:19:53.117000 audit[4686]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=4672 pid=4686 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364663666663232336339333066303164326262663164363634336561 Dec 16 12:19:53.117000 audit: BPF prog-id=216 op=UNLOAD Dec 16 12:19:53.117000 audit[4686]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4672 pid=4686 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364663666663232336339333066303164326262663164363634336561 Dec 16 12:19:53.117000 audit: BPF prog-id=215 op=UNLOAD Dec 16 12:19:53.117000 audit[4686]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4672 pid=4686 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364663666663232336339333066303164326262663164363634336561 Dec 16 12:19:53.117000 audit: BPF prog-id=217 op=LOAD Dec 16 12:19:53.117000 audit[4686]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=4672 pid=4686 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3364663666663232336339333066303164326262663164363634336561 Dec 16 12:19:53.124094 systemd[1]: Started cri-containerd-4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a.scope - libcontainer container 4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a. 
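The audit PROCTITLE fields in the surrounding records are the process command line, hex encoded with NUL bytes between the arguments. A short Go snippet to decode them; the sample value is the runc proctitle from the [4686] records above, and the container ID at the end is cut short because the kernel caps the recorded proctitle length:

```go
// Decode an audit PROCTITLE value into a readable command line.
// The sample is copied from the runc audit records in this log.
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

func main() {
	proctitle := "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E63" +
		"2F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F" +
		"2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E" +
		"696F2F3364663666663232336339333066303164326262663164363634336561"

	raw, err := hex.DecodeString(proctitle)
	if err != nil {
		panic(err)
	}
	// NUL separates argv entries; replace with spaces for display.
	fmt.Println(strings.ReplaceAll(string(raw), "\x00", " "))
	// runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/3df6ff223c930f01d2bbf1d6643ea
}
```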
Dec 16 12:19:53.153000 audit: BPF prog-id=218 op=LOAD Dec 16 12:19:53.154000 audit: BPF prog-id=219 op=LOAD Dec 16 12:19:53.154000 audit[4730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.154000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431383264343264393932643065623130393866353836626566623763 Dec 16 12:19:53.154000 audit: BPF prog-id=219 op=UNLOAD Dec 16 12:19:53.154000 audit[4730]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.154000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431383264343264393932643065623130393866353836626566623763 Dec 16 12:19:53.155000 audit: BPF prog-id=220 op=LOAD Dec 16 12:19:53.155000 audit[4730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.155000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431383264343264393932643065623130393866353836626566623763 Dec 16 12:19:53.155000 audit: BPF prog-id=221 op=LOAD Dec 16 12:19:53.155000 audit[4730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.155000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431383264343264393932643065623130393866353836626566623763 Dec 16 12:19:53.156000 audit: BPF prog-id=221 op=UNLOAD Dec 16 12:19:53.156000 audit[4730]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.156000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431383264343264393932643065623130393866353836626566623763 Dec 16 12:19:53.156000 audit: BPF prog-id=220 op=UNLOAD Dec 16 12:19:53.156000 audit[4730]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.156000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431383264343264393932643065623130393866353836626566623763 Dec 16 12:19:53.156000 audit: BPF prog-id=222 op=LOAD Dec 16 12:19:53.156000 audit[4730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.156000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431383264343264393932643065623130393866353836626566623763 Dec 16 12:19:53.201329 containerd[1576]: time="2025-12-16T12:19:53.201288816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85ddfd7b99-lzr5b,Uid:4dd8654f-30cc-4aed-bf7b-e1600d664a65,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"aaf8c30568ef1c99993d45bbd6b48f182dc38306870532188deb7e61cf8c4a42\"" Dec 16 12:19:53.204375 containerd[1576]: time="2025-12-16T12:19:53.204331938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6nr2l,Uid:30279a80-ac32-4c4e-affe-8e2742945896,Namespace:calico-system,Attempt:0,} returns sandbox id \"4182d42d992d0eb1098f586befb7ca29930eec43f1ab4a985df8685e6fe1fc5a\"" Dec 16 12:19:53.206035 containerd[1576]: time="2025-12-16T12:19:53.206001023Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:19:53.219759 containerd[1576]: time="2025-12-16T12:19:53.219705271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-85msp,Uid:6ad46257-20db-42d2-b357-93e753f0c2ca,Namespace:calico-system,Attempt:0,} returns sandbox id \"3df6ff223c930f01d2bbf1d6643ea3eb4b5711308a112308c4aeef1334de6095\"" Dec 16 12:19:53.264239 kubelet[2821]: I1216 12:19:53.264201 2821 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:19:53.552483 containerd[1576]: time="2025-12-16T12:19:53.552398619Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:53.554810 containerd[1576]: time="2025-12-16T12:19:53.554651760Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:19:53.554810 containerd[1576]: time="2025-12-16T12:19:53.554754643Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:53.555135 kubelet[2821]: E1216 12:19:53.555101 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:53.555284 kubelet[2821]: E1216 12:19:53.555198 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:53.555495 kubelet[2821]: E1216 12:19:53.555445 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-85ddfd7b99-lzr5b_calico-apiserver(4dd8654f-30cc-4aed-bf7b-e1600d664a65): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:53.555685 kubelet[2821]: E1216 12:19:53.555530 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:19:53.556432 containerd[1576]: time="2025-12-16T12:19:53.556377006Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:19:53.647864 kubelet[2821]: I1216 12:19:53.647766 2821 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:19:53.696000 audit[4824]: NETFILTER_CFG table=filter:125 family=2 entries=15 op=nft_register_rule pid=4824 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:53.699295 kernel: kauditd_printk_skb: 206 callbacks suppressed Dec 16 12:19:53.699451 kernel: audit: type=1325 audit(1765887593.696:656): table=filter:125 family=2 entries=15 op=nft_register_rule pid=4824 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:53.696000 audit[4824]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff2e032d0 a2=0 a3=1 items=0 ppid=2972 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.702435 kernel: audit: type=1300 audit(1765887593.696:656): arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff2e032d0 a2=0 a3=1 items=0 ppid=2972 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.702865 kernel: audit: type=1327 audit(1765887593.696:656): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:53.696000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:53.703000 audit[4824]: NETFILTER_CFG table=nat:126 family=2 entries=25 op=nft_register_chain pid=4824 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:53.705865 kernel: audit: type=1325 audit(1765887593.703:657): table=nat:126 family=2 entries=25 op=nft_register_chain pid=4824 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 
12:19:53.705932 kernel: audit: type=1300 audit(1765887593.703:657): arch=c00000b7 syscall=211 success=yes exit=8580 a0=3 a1=fffff2e032d0 a2=0 a3=1 items=0 ppid=2972 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.703000 audit[4824]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8580 a0=3 a1=fffff2e032d0 a2=0 a3=1 items=0 ppid=2972 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:53.703000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:53.709289 kernel: audit: type=1327 audit(1765887593.703:657): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:53.746468 kubelet[2821]: E1216 12:19:53.746393 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:19:53.910988 containerd[1576]: time="2025-12-16T12:19:53.910931662Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:53.912838 containerd[1576]: time="2025-12-16T12:19:53.912672749Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:19:53.912838 containerd[1576]: time="2025-12-16T12:19:53.912781832Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:53.913814 kubelet[2821]: E1216 12:19:53.913760 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:19:53.914116 kubelet[2821]: E1216 12:19:53.913984 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:19:53.915530 kubelet[2821]: E1216 12:19:53.914353 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:53.915608 containerd[1576]: time="2025-12-16T12:19:53.914597001Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:19:54.271215 containerd[1576]: time="2025-12-16T12:19:54.271071724Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:54.272512 containerd[1576]: time="2025-12-16T12:19:54.272437440Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:19:54.272752 containerd[1576]: time="2025-12-16T12:19:54.272550443Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:54.273121 kubelet[2821]: E1216 12:19:54.272923 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:19:54.273121 kubelet[2821]: E1216 12:19:54.272973 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:19:54.273470 kubelet[2821]: E1216 12:19:54.273169 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-85msp_calico-system(6ad46257-20db-42d2-b357-93e753f0c2ca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:54.273983 kubelet[2821]: E1216 12:19:54.273836 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:19:54.274157 containerd[1576]: time="2025-12-16T12:19:54.274080564Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:19:54.290000 audit: BPF prog-id=223 op=LOAD Dec 16 12:19:54.290000 audit[4869]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffeb5d19c8 a2=98 a3=ffffeb5d19b8 items=0 ppid=4853 pid=4869 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.297585 kernel: audit: type=1334 audit(1765887594.290:658): prog-id=223 op=LOAD Dec 16 12:19:54.297710 kernel: audit: type=1300 audit(1765887594.290:658): arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffeb5d19c8 a2=98 a3=ffffeb5d19b8 items=0 ppid=4853 pid=4869 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.297732 kernel: audit: type=1327 audit(1765887594.290:658): 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:19:54.290000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:19:54.290000 audit: BPF prog-id=223 op=UNLOAD Dec 16 12:19:54.305685 kernel: audit: type=1334 audit(1765887594.290:659): prog-id=223 op=UNLOAD Dec 16 12:19:54.290000 audit[4869]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffeb5d1998 a3=0 items=0 ppid=4853 pid=4869 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.290000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:19:54.290000 audit: BPF prog-id=224 op=LOAD Dec 16 12:19:54.290000 audit[4869]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffeb5d1878 a2=74 a3=95 items=0 ppid=4853 pid=4869 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.290000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:19:54.291000 audit: BPF prog-id=224 op=UNLOAD Dec 16 12:19:54.291000 audit[4869]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4853 pid=4869 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.291000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:19:54.291000 audit: BPF prog-id=225 op=LOAD Dec 16 12:19:54.291000 audit[4869]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffeb5d18a8 a2=40 a3=ffffeb5d18d8 items=0 ppid=4853 pid=4869 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.291000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:19:54.291000 audit: BPF prog-id=225 op=UNLOAD Dec 16 12:19:54.291000 audit[4869]: SYSCALL arch=c00000b7 syscall=57 
success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffeb5d18d8 items=0 ppid=4853 pid=4869 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.291000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:19:54.299000 audit: BPF prog-id=226 op=LOAD Dec 16 12:19:54.299000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc9e4e318 a2=98 a3=ffffc9e4e308 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.299000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.299000 audit: BPF prog-id=226 op=UNLOAD Dec 16 12:19:54.299000 audit[4870]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffc9e4e2e8 a3=0 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.299000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.299000 audit: BPF prog-id=227 op=LOAD Dec 16 12:19:54.299000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc9e4dfa8 a2=74 a3=95 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.299000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.299000 audit: BPF prog-id=227 op=UNLOAD Dec 16 12:19:54.299000 audit[4870]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.299000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.299000 audit: BPF prog-id=228 op=LOAD Dec 16 12:19:54.299000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc9e4e008 a2=94 a3=2 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.299000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.299000 audit: BPF prog-id=228 op=UNLOAD Dec 16 12:19:54.299000 audit[4870]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.299000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.419000 audit: BPF prog-id=229 op=LOAD Dec 16 12:19:54.419000 audit[4870]: 
SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc9e4dfc8 a2=40 a3=ffffc9e4dff8 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.419000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.419000 audit: BPF prog-id=229 op=UNLOAD Dec 16 12:19:54.419000 audit[4870]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffc9e4dff8 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.419000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.431000 audit: BPF prog-id=230 op=LOAD Dec 16 12:19:54.431000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc9e4dfd8 a2=94 a3=4 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.431000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.431000 audit: BPF prog-id=230 op=UNLOAD Dec 16 12:19:54.431000 audit[4870]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.431000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.432000 audit: BPF prog-id=231 op=LOAD Dec 16 12:19:54.432000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffc9e4de18 a2=94 a3=5 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.432000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.432000 audit: BPF prog-id=231 op=UNLOAD Dec 16 12:19:54.432000 audit[4870]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.432000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.432000 audit: BPF prog-id=232 op=LOAD Dec 16 12:19:54.432000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc9e4e048 a2=94 a3=6 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.432000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.432000 audit: BPF prog-id=232 op=UNLOAD Dec 16 12:19:54.432000 audit[4870]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.432000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.432000 audit: BPF prog-id=233 op=LOAD Dec 16 12:19:54.432000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc9e4d818 a2=94 a3=83 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.432000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.432000 audit: BPF prog-id=234 op=LOAD Dec 16 12:19:54.432000 audit[4870]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffc9e4d5d8 a2=94 a3=2 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.432000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.432000 audit: BPF prog-id=234 op=UNLOAD Dec 16 12:19:54.432000 audit[4870]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.432000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.434000 audit: BPF prog-id=233 op=UNLOAD Dec 16 12:19:54.434000 audit[4870]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=5628620 a3=561bb00 items=0 ppid=4853 pid=4870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.434000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:19:54.456000 audit: BPF prog-id=235 op=LOAD Dec 16 12:19:54.456000 audit[4880]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff13b5a18 a2=98 a3=fffff13b5a08 items=0 ppid=4853 pid=4880 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.456000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:19:54.456000 audit: BPF prog-id=235 op=UNLOAD Dec 16 12:19:54.456000 audit[4880]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff13b59e8 a3=0 items=0 ppid=4853 pid=4880 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.456000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:19:54.456000 
audit: BPF prog-id=236 op=LOAD Dec 16 12:19:54.456000 audit[4880]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff13b58c8 a2=74 a3=95 items=0 ppid=4853 pid=4880 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.456000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:19:54.456000 audit: BPF prog-id=236 op=UNLOAD Dec 16 12:19:54.456000 audit[4880]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4853 pid=4880 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.456000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:19:54.456000 audit: BPF prog-id=237 op=LOAD Dec 16 12:19:54.456000 audit[4880]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff13b58f8 a2=40 a3=fffff13b5928 items=0 ppid=4853 pid=4880 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.456000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:19:54.457000 audit: BPF prog-id=237 op=UNLOAD Dec 16 12:19:54.457000 audit[4880]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=fffff13b5928 items=0 ppid=4853 pid=4880 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.457000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:19:54.540856 systemd-networkd[1470]: cali85329eceefd: Gained IPv6LL Dec 16 12:19:54.578837 systemd-networkd[1470]: vxlan.calico: Link UP Dec 16 12:19:54.578845 systemd-networkd[1470]: vxlan.calico: Gained carrier Dec 16 12:19:54.616000 audit: BPF prog-id=238 op=LOAD Dec 16 12:19:54.616000 audit[4913]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0e5d448 a2=98 a3=ffffd0e5d438 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.616000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.616000 audit: BPF prog-id=238 op=UNLOAD Dec 16 12:19:54.616000 audit[4913]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd0e5d418 a3=0 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.616000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.617000 audit: BPF prog-id=239 op=LOAD Dec 16 12:19:54.617000 audit[4913]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0e5d128 a2=74 a3=95 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.617000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.617000 audit: BPF prog-id=239 op=UNLOAD Dec 16 12:19:54.617000 audit[4913]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.617000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.617000 audit: BPF prog-id=240 op=LOAD Dec 16 12:19:54.617000 audit[4913]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0e5d188 a2=94 a3=2 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.617000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.617000 audit: BPF prog-id=240 op=UNLOAD Dec 16 12:19:54.617000 audit[4913]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.617000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.617000 audit: BPF prog-id=241 op=LOAD Dec 16 12:19:54.617000 audit[4913]: SYSCALL arch=c00000b7 syscall=280 
success=yes exit=6 a0=5 a1=ffffd0e5d008 a2=40 a3=ffffd0e5d038 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.617000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.617000 audit: BPF prog-id=241 op=UNLOAD Dec 16 12:19:54.617000 audit[4913]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffd0e5d038 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.617000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.617000 audit: BPF prog-id=242 op=LOAD Dec 16 12:19:54.617000 audit[4913]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd0e5d158 a2=94 a3=b7 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.617000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.617000 audit: BPF prog-id=242 op=UNLOAD Dec 16 12:19:54.617000 audit[4913]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.617000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.619000 audit: BPF prog-id=243 op=LOAD Dec 16 12:19:54.619000 audit[4913]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd0e5c808 a2=94 a3=2 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.619000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.619000 audit: BPF prog-id=243 op=UNLOAD Dec 16 12:19:54.619000 audit[4913]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.619000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.619000 audit: BPF prog-id=244 op=LOAD Dec 16 12:19:54.619000 audit[4913]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd0e5c998 a2=94 a3=30 items=0 ppid=4853 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.619000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:19:54.628000 audit: BPF prog-id=245 op=LOAD Dec 16 12:19:54.628000 audit[4918]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdc389bc8 a2=98 a3=ffffdc389bb8 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.628000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.628000 audit: BPF prog-id=245 op=UNLOAD Dec 16 12:19:54.628000 audit[4918]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffdc389b98 a3=0 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.628000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.628000 audit: BPF prog-id=246 op=LOAD Dec 16 12:19:54.628000 audit[4918]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffdc389858 a2=74 a3=95 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.628000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.628000 audit: BPF prog-id=246 op=UNLOAD Dec 16 12:19:54.628000 audit[4918]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.628000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.628000 audit: BPF prog-id=247 op=LOAD Dec 16 12:19:54.628000 audit[4918]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffdc3898b8 a2=94 a3=2 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.628000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.631000 audit: BPF prog-id=247 op=UNLOAD Dec 16 12:19:54.631000 audit[4918]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.631000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.638069 containerd[1576]: time="2025-12-16T12:19:54.635885568Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:54.646506 containerd[1576]: time="2025-12-16T12:19:54.646342047Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:19:54.646506 containerd[1576]: time="2025-12-16T12:19:54.646388968Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:54.647327 kubelet[2821]: E1216 12:19:54.647093 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:19:54.647327 kubelet[2821]: E1216 12:19:54.647144 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:19:54.647327 kubelet[2821]: E1216 12:19:54.647234 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:54.647327 kubelet[2821]: E1216 12:19:54.647274 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:19:54.718000 audit[4920]: NETFILTER_CFG table=filter:127 family=2 entries=14 op=nft_register_rule pid=4920 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:54.718000 audit[4920]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffdce471a0 a2=0 a3=1 items=0 ppid=2972 pid=4920 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.718000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:54.722000 audit[4920]: NETFILTER_CFG table=nat:128 family=2 entries=20 op=nft_register_rule pid=4920 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:54.722000 audit[4920]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffdce471a0 a2=0 a3=1 items=0 ppid=2972 pid=4920 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.722000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:54.749868 kubelet[2821]: E1216 12:19:54.749258 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:19:54.750772 kubelet[2821]: E1216 12:19:54.750006 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:19:54.752092 kubelet[2821]: E1216 12:19:54.752021 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:19:54.774000 audit: BPF prog-id=248 op=LOAD Dec 16 12:19:54.774000 audit[4918]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffdc389878 a2=40 a3=ffffdc3898a8 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.774000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.774000 audit: BPF prog-id=248 op=UNLOAD Dec 16 12:19:54.774000 audit[4918]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffdc3898a8 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.774000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.796793 systemd-networkd[1470]: califc70e9e1e2f: Gained IPv6LL Dec 16 12:19:54.819000 audit: BPF prog-id=249 op=LOAD Dec 16 12:19:54.819000 audit[4918]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffdc389888 a2=94 a3=4 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.819000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.819000 audit: BPF prog-id=249 op=UNLOAD Dec 16 12:19:54.819000 audit[4918]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.819000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.820000 audit: BPF prog-id=250 op=LOAD Dec 16 12:19:54.820000 audit[4918]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffdc3896c8 a2=94 a3=5 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.820000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.820000 audit: BPF prog-id=250 op=UNLOAD Dec 16 12:19:54.820000 audit[4918]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.820000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.820000 audit: BPF prog-id=251 op=LOAD Dec 16 12:19:54.820000 audit[4918]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffdc3898f8 a2=94 a3=6 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.820000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.820000 audit: BPF prog-id=251 op=UNLOAD Dec 16 12:19:54.820000 audit[4918]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.820000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.820000 audit: BPF prog-id=252 op=LOAD Dec 16 12:19:54.820000 audit[4918]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffdc3890c8 a2=94 a3=83 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.820000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.821000 audit: BPF prog-id=253 op=LOAD Dec 16 12:19:54.821000 audit[4918]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffdc388e88 a2=94 a3=2 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.821000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.821000 audit: BPF prog-id=253 op=UNLOAD Dec 16 12:19:54.821000 audit[4918]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.821000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.821000 audit: BPF prog-id=252 op=UNLOAD Dec 16 12:19:54.821000 audit[4918]: SYSCALL arch=c00000b7 syscall=57 
success=yes exit=0 a0=5 a1=57156c a2=12e4f620 a3=12e42b00 items=0 ppid=4853 pid=4918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.821000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:19:54.830000 audit: BPF prog-id=244 op=UNLOAD Dec 16 12:19:54.830000 audit[4853]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=400080e140 a2=0 a3=0 items=0 ppid=3893 pid=4853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.830000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Dec 16 12:19:54.971000 audit[4949]: NETFILTER_CFG table=mangle:129 family=2 entries=16 op=nft_register_chain pid=4949 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:19:54.971000 audit[4949]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=ffffce97e4d0 a2=0 a3=ffffb86eafa8 items=0 ppid=4853 pid=4949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.971000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:19:54.994000 audit[4952]: NETFILTER_CFG table=nat:130 family=2 entries=15 op=nft_register_chain pid=4952 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:19:54.994000 audit[4952]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffdf709bf0 a2=0 a3=ffffb13c4fa8 items=0 ppid=4853 pid=4952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.994000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:19:54.995000 audit[4950]: NETFILTER_CFG table=raw:131 family=2 entries=21 op=nft_register_chain pid=4950 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:19:54.995000 audit[4950]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffce12baf0 a2=0 a3=ffffb1554fa8 items=0 ppid=4853 pid=4950 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.995000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:19:55.001000 audit[4951]: NETFILTER_CFG table=filter:132 family=2 entries=327 op=nft_register_chain pid=4951 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:19:55.001000 audit[4951]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=193468 a0=3 a1=ffffef2f0250 a2=0 a3=ffffae129fa8 items=0 ppid=4853 pid=4951 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:55.001000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:19:55.052120 systemd-networkd[1470]: calif0773f5ba7f: Gained IPv6LL Dec 16 12:19:55.489807 sshd[4082]: Connection closed by 138.68.91.238 port 40782 [preauth] Dec 16 12:19:55.492535 systemd[1]: sshd@7-46.224.130.63:22-138.68.91.238:40782.service: Deactivated successfully. Dec 16 12:19:55.493000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-46.224.130.63:22-138.68.91.238:40782 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:19:55.628150 systemd-networkd[1470]: vxlan.calico: Gained IPv6LL Dec 16 12:19:55.741000 audit[4967]: NETFILTER_CFG table=filter:133 family=2 entries=14 op=nft_register_rule pid=4967 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:55.741000 audit[4967]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc839cca0 a2=0 a3=1 items=0 ppid=2972 pid=4967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:55.741000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:55.750000 audit[4967]: NETFILTER_CFG table=nat:134 family=2 entries=20 op=nft_register_rule pid=4967 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:19:55.750000 audit[4967]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc839cca0 a2=0 a3=1 items=0 ppid=2972 pid=4967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:55.750000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:19:56.443642 containerd[1576]: time="2025-12-16T12:19:56.443425531Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:19:56.879199 containerd[1576]: time="2025-12-16T12:19:56.878235812Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:56.880526 containerd[1576]: time="2025-12-16T12:19:56.880327187Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:19:56.880526 containerd[1576]: time="2025-12-16T12:19:56.880376148Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:56.880766 kubelet[2821]: E1216 12:19:56.880720 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:19:56.882512 kubelet[2821]: E1216 
12:19:56.880786 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:19:56.882512 kubelet[2821]: E1216 12:19:56.881107 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-85dd648564-9wttf_calico-system(b32c8c20-6807-4c19-8ec5-b6f0be7cc07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:56.883272 containerd[1576]: time="2025-12-16T12:19:56.883008417Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:19:57.223173 containerd[1576]: time="2025-12-16T12:19:57.223044534Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:57.224589 containerd[1576]: time="2025-12-16T12:19:57.224527726Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:19:57.224874 containerd[1576]: time="2025-12-16T12:19:57.224647888Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:57.225104 kubelet[2821]: E1216 12:19:57.224903 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:19:57.225104 kubelet[2821]: E1216 12:19:57.224975 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:19:57.225104 kubelet[2821]: E1216 12:19:57.225076 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-85dd648564-9wttf_calico-system(b32c8c20-6807-4c19-8ec5-b6f0be7cc07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:57.225312 kubelet[2821]: E1216 12:19:57.225139 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" 
pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:20:06.444343 containerd[1576]: time="2025-12-16T12:20:06.444103094Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:20:06.834759 containerd[1576]: time="2025-12-16T12:20:06.834487169Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:06.836013 containerd[1576]: time="2025-12-16T12:20:06.835955113Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:20:06.836149 containerd[1576]: time="2025-12-16T12:20:06.836066792Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:06.836385 kubelet[2821]: E1216 12:20:06.836305 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:20:06.836385 kubelet[2821]: E1216 12:20:06.836366 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:20:06.836826 kubelet[2821]: E1216 12:20:06.836555 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-85ddfd7b99-s9l56_calico-apiserver(b4c4df94-c3ee-426c-b06d-ed9edc99469b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:06.836826 kubelet[2821]: E1216 12:20:06.836592 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:20:06.838716 containerd[1576]: time="2025-12-16T12:20:06.838562405Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:20:07.218471 containerd[1576]: time="2025-12-16T12:20:07.218282155Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:07.220104 containerd[1576]: time="2025-12-16T12:20:07.220004178Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:20:07.220767 containerd[1576]: time="2025-12-16T12:20:07.220109937Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:07.220999 kubelet[2821]: E1216 
12:20:07.220938 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:20:07.220999 kubelet[2821]: E1216 12:20:07.221000 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:20:07.221422 kubelet[2821]: E1216 12:20:07.221099 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5d865fff8d-bxz6x_calico-system(bbfa367f-11d3-466c-9181-c6ee23836f5f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:07.221422 kubelet[2821]: E1216 12:20:07.221132 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:20:07.443040 containerd[1576]: time="2025-12-16T12:20:07.442964398Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:20:07.786668 containerd[1576]: time="2025-12-16T12:20:07.786444577Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:07.788983 containerd[1576]: time="2025-12-16T12:20:07.788929072Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:20:07.789660 containerd[1576]: time="2025-12-16T12:20:07.789022752Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:07.789788 kubelet[2821]: E1216 12:20:07.789591 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:20:07.789788 kubelet[2821]: E1216 12:20:07.789709 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:20:07.789870 kubelet[2821]: E1216 12:20:07.789821 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-85msp_calico-system(6ad46257-20db-42d2-b357-93e753f0c2ca): ErrImagePull: rpc error: code 
= NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:07.789972 kubelet[2821]: E1216 12:20:07.789862 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:20:09.442476 containerd[1576]: time="2025-12-16T12:20:09.442179861Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:20:09.786282 containerd[1576]: time="2025-12-16T12:20:09.786001476Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:09.787602 containerd[1576]: time="2025-12-16T12:20:09.787539663Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:20:09.787725 containerd[1576]: time="2025-12-16T12:20:09.787683742Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:09.787999 kubelet[2821]: E1216 12:20:09.787927 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:20:09.788538 kubelet[2821]: E1216 12:20:09.787997 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:20:09.788538 kubelet[2821]: E1216 12:20:09.788264 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-85ddfd7b99-lzr5b_calico-apiserver(4dd8654f-30cc-4aed-bf7b-e1600d664a65): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:09.788538 kubelet[2821]: E1216 12:20:09.788305 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:20:09.789876 containerd[1576]: time="2025-12-16T12:20:09.789367288Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:20:10.150050 containerd[1576]: time="2025-12-16T12:20:10.149955089Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:10.151614 containerd[1576]: 
time="2025-12-16T12:20:10.151554197Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:20:10.151860 containerd[1576]: time="2025-12-16T12:20:10.151584917Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:10.151937 kubelet[2821]: E1216 12:20:10.151869 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:20:10.151937 kubelet[2821]: E1216 12:20:10.151925 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:20:10.152456 kubelet[2821]: E1216 12:20:10.152015 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:10.153178 containerd[1576]: time="2025-12-16T12:20:10.153141906Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:20:10.494302 containerd[1576]: time="2025-12-16T12:20:10.494103667Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:10.496355 containerd[1576]: time="2025-12-16T12:20:10.496270811Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:20:10.496715 containerd[1576]: time="2025-12-16T12:20:10.496283211Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:10.497562 kubelet[2821]: E1216 12:20:10.496682 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:20:10.497562 kubelet[2821]: E1216 12:20:10.496772 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:20:10.497562 kubelet[2821]: E1216 12:20:10.496861 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896): 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:10.497562 kubelet[2821]: E1216 12:20:10.496906 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:20:11.443322 kubelet[2821]: E1216 12:20:11.443249 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:20:17.443524 kubelet[2821]: E1216 12:20:17.443328 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:20:22.445148 kubelet[2821]: E1216 12:20:22.444923 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:20:22.447004 kubelet[2821]: E1216 12:20:22.445248 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:20:24.444115 kubelet[2821]: E1216 12:20:24.443991 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:20:24.445392 kubelet[2821]: E1216 12:20:24.445345 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:20:27.441892 containerd[1576]: time="2025-12-16T12:20:27.441840104Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:20:27.802644 containerd[1576]: time="2025-12-16T12:20:27.801990350Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:27.804042 containerd[1576]: time="2025-12-16T12:20:27.803840756Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:20:27.804419 containerd[1576]: time="2025-12-16T12:20:27.803884836Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:27.804851 kubelet[2821]: E1216 12:20:27.804698 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:20:27.806159 kubelet[2821]: E1216 12:20:27.804789 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:20:27.806159 kubelet[2821]: E1216 12:20:27.805744 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-85dd648564-9wttf_calico-system(b32c8c20-6807-4c19-8ec5-b6f0be7cc07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:27.807852 containerd[1576]: time="2025-12-16T12:20:27.807804449Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:20:28.185646 containerd[1576]: time="2025-12-16T12:20:28.185455680Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:28.188166 containerd[1576]: time="2025-12-16T12:20:28.187974450Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:20:28.188166 containerd[1576]: time="2025-12-16T12:20:28.188041850Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:28.189858 kubelet[2821]: E1216 12:20:28.189812 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:20:28.190127 kubelet[2821]: E1216 12:20:28.189988 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:20:28.190711 kubelet[2821]: E1216 12:20:28.190250 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-85dd648564-9wttf_calico-system(b32c8c20-6807-4c19-8ec5-b6f0be7cc07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:28.190711 kubelet[2821]: E1216 12:20:28.190294 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:20:29.440802 containerd[1576]: time="2025-12-16T12:20:29.440756910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:20:29.780353 containerd[1576]: time="2025-12-16T12:20:29.779808558Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:29.781969 containerd[1576]: time="2025-12-16T12:20:29.781922327Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:20:29.782281 containerd[1576]: time="2025-12-16T12:20:29.781987727Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:29.783838 kubelet[2821]: E1216 12:20:29.783787 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:20:29.784172 kubelet[2821]: E1216 12:20:29.783841 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:20:29.784172 kubelet[2821]: E1216 12:20:29.783913 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5d865fff8d-bxz6x_calico-system(bbfa367f-11d3-466c-9181-c6ee23836f5f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:29.784172 kubelet[2821]: E1216 12:20:29.783946 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:20:34.446038 containerd[1576]: time="2025-12-16T12:20:34.445726787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:20:34.804383 containerd[1576]: time="2025-12-16T12:20:34.803257012Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:34.806372 containerd[1576]: time="2025-12-16T12:20:34.806065150Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:20:34.806372 containerd[1576]: time="2025-12-16T12:20:34.806210151Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:34.807439 kubelet[2821]: E1216 12:20:34.807361 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:20:34.807439 kubelet[2821]: E1216 12:20:34.807417 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:20:34.808791 kubelet[2821]: E1216 12:20:34.807767 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-85msp_calico-system(6ad46257-20db-42d2-b357-93e753f0c2ca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:34.809098 kubelet[2821]: E1216 12:20:34.809044 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:20:36.444798 containerd[1576]: time="2025-12-16T12:20:36.444707798Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:20:36.809768 containerd[1576]: time="2025-12-16T12:20:36.809303857Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:36.810969 containerd[1576]: time="2025-12-16T12:20:36.810907029Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:20:36.811579 containerd[1576]: time="2025-12-16T12:20:36.810943909Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:36.811773 kubelet[2821]: E1216 12:20:36.811241 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:20:36.811773 kubelet[2821]: E1216 12:20:36.811297 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:20:36.811773 kubelet[2821]: E1216 12:20:36.811388 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-85ddfd7b99-lzr5b_calico-apiserver(4dd8654f-30cc-4aed-bf7b-e1600d664a65): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:36.812766 kubelet[2821]: E1216 12:20:36.812158 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:20:37.440997 containerd[1576]: time="2025-12-16T12:20:37.440913402Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:20:37.830300 containerd[1576]: time="2025-12-16T12:20:37.830155053Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:37.833508 containerd[1576]: time="2025-12-16T12:20:37.833013794Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:20:37.836121 containerd[1576]: time="2025-12-16T12:20:37.833229596Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:37.836445 kubelet[2821]: E1216 12:20:37.836399 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:20:37.839178 kubelet[2821]: E1216 12:20:37.836772 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:20:37.839178 kubelet[2821]: E1216 12:20:37.838740 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-85ddfd7b99-s9l56_calico-apiserver(b4c4df94-c3ee-426c-b06d-ed9edc99469b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:37.839178 kubelet[2821]: E1216 12:20:37.838785 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:20:37.841503 containerd[1576]: time="2025-12-16T12:20:37.841458577Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:20:38.185510 containerd[1576]: time="2025-12-16T12:20:38.185439115Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:38.187384 containerd[1576]: time="2025-12-16T12:20:38.187294409Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:20:38.187885 containerd[1576]: time="2025-12-16T12:20:38.187347970Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:38.188243 kubelet[2821]: E1216 12:20:38.188187 2821 log.go:32] "PullImage from image 
service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:20:38.188482 kubelet[2821]: E1216 12:20:38.188461 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:20:38.189662 kubelet[2821]: E1216 12:20:38.188711 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:38.190455 containerd[1576]: time="2025-12-16T12:20:38.190409634Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:20:38.531956 containerd[1576]: time="2025-12-16T12:20:38.531586524Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:20:38.533800 containerd[1576]: time="2025-12-16T12:20:38.533716661Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:20:38.534166 containerd[1576]: time="2025-12-16T12:20:38.533723821Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:20:38.534409 kubelet[2821]: E1216 12:20:38.534288 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:20:38.534409 kubelet[2821]: E1216 12:20:38.534338 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:20:38.534409 kubelet[2821]: E1216 12:20:38.534422 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:20:38.534862 kubelet[2821]: E1216 12:20:38.534470 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:20:42.444938 kubelet[2821]: E1216 12:20:42.444887 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:20:43.440950 kubelet[2821]: E1216 12:20:43.440661 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:20:48.443953 kubelet[2821]: E1216 12:20:48.443469 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:20:49.442104 kubelet[2821]: E1216 12:20:49.441607 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:20:52.445261 kubelet[2821]: E1216 12:20:52.445199 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:20:53.448903 kubelet[2821]: E1216 12:20:53.448839 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:20:56.448708 kubelet[2821]: E1216 12:20:56.448576 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:20:57.441343 kubelet[2821]: E1216 12:20:57.441271 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:20:59.441686 kubelet[2821]: E1216 12:20:59.441614 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:21:00.442130 kubelet[2821]: E1216 12:21:00.441718 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc 
error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:21:04.446288 kubelet[2821]: E1216 12:21:04.446214 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:21:04.448991 kubelet[2821]: E1216 12:21:04.448479 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:21:09.440464 kubelet[2821]: E1216 12:21:09.440394 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:21:10.442211 containerd[1576]: time="2025-12-16T12:21:10.441640947Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:21:10.789741 containerd[1576]: time="2025-12-16T12:21:10.789544438Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:10.792358 containerd[1576]: time="2025-12-16T12:21:10.792147836Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:21:10.792829 containerd[1576]: time="2025-12-16T12:21:10.792339039Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:10.793873 kubelet[2821]: E1216 12:21:10.793816 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:21:10.794795 kubelet[2821]: E1216 12:21:10.793883 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:21:10.794795 kubelet[2821]: E1216 12:21:10.793983 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-85dd648564-9wttf_calico-system(b32c8c20-6807-4c19-8ec5-b6f0be7cc07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:10.798097 containerd[1576]: time="2025-12-16T12:21:10.797723077Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:21:11.124237 containerd[1576]: time="2025-12-16T12:21:11.124041429Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:11.125664 containerd[1576]: time="2025-12-16T12:21:11.125592372Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:21:11.125811 containerd[1576]: time="2025-12-16T12:21:11.125706254Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:11.126174 kubelet[2821]: E1216 12:21:11.126043 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:21:11.126174 kubelet[2821]: E1216 12:21:11.126114 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:21:11.126518 kubelet[2821]: E1216 12:21:11.126186 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-85dd648564-9wttf_calico-system(b32c8c20-6807-4c19-8ec5-b6f0be7cc07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:11.126518 kubelet[2821]: E1216 12:21:11.126223 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:21:13.442246 kubelet[2821]: E1216 12:21:13.441768 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:21:15.441606 kubelet[2821]: E1216 12:21:15.441059 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:21:15.442534 kubelet[2821]: E1216 12:21:15.441932 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:21:16.447151 kubelet[2821]: E1216 12:21:16.445597 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:21:24.443585 containerd[1576]: time="2025-12-16T12:21:24.443525320Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:21:24.775781 containerd[1576]: time="2025-12-16T12:21:24.775320873Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:24.777101 containerd[1576]: time="2025-12-16T12:21:24.777017180Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:21:24.777765 containerd[1576]: time="2025-12-16T12:21:24.777199383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:24.778756 kubelet[2821]: E1216 12:21:24.777952 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:21:24.778756 kubelet[2821]: E1216 12:21:24.778026 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:21:24.778756 kubelet[2821]: E1216 12:21:24.778135 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5d865fff8d-bxz6x_calico-system(bbfa367f-11d3-466c-9181-c6ee23836f5f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:24.778756 kubelet[2821]: E1216 12:21:24.778210 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:21:25.450872 kubelet[2821]: E1216 12:21:25.450811 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:21:26.445540 containerd[1576]: time="2025-12-16T12:21:26.443932576Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:21:26.795192 containerd[1576]: time="2025-12-16T12:21:26.794688483Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:26.796389 containerd[1576]: time="2025-12-16T12:21:26.796234188Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:21:26.796389 containerd[1576]: time="2025-12-16T12:21:26.796334030Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:26.796995 kubelet[2821]: E1216 12:21:26.796789 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:21:26.796995 kubelet[2821]: E1216 12:21:26.796843 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:21:26.797758 kubelet[2821]: E1216 12:21:26.797013 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-85ddfd7b99-lzr5b_calico-apiserver(4dd8654f-30cc-4aed-bf7b-e1600d664a65): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:26.797758 kubelet[2821]: E1216 12:21:26.797056 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:21:26.798248 containerd[1576]: time="2025-12-16T12:21:26.798008217Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:21:27.133772 containerd[1576]: time="2025-12-16T12:21:27.133003001Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:27.136737 containerd[1576]: time="2025-12-16T12:21:27.136669420Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:21:27.137021 containerd[1576]: time="2025-12-16T12:21:27.136893263Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:27.137369 kubelet[2821]: E1216 12:21:27.137319 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:21:27.137464 kubelet[2821]: E1216 12:21:27.137383 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:21:27.137493 kubelet[2821]: E1216 12:21:27.137466 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-85ddfd7b99-s9l56_calico-apiserver(b4c4df94-c3ee-426c-b06d-ed9edc99469b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:27.137531 kubelet[2821]: E1216 12:21:27.137499 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:21:27.442701 containerd[1576]: time="2025-12-16T12:21:27.442494588Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:21:27.876682 containerd[1576]: time="2025-12-16T12:21:27.876319700Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:27.878045 containerd[1576]: time="2025-12-16T12:21:27.877940646Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:21:27.878347 containerd[1576]: time="2025-12-16T12:21:27.878009607Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:27.878747 kubelet[2821]: E1216 12:21:27.878705 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:21:27.879422 kubelet[2821]: E1216 12:21:27.878751 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:21:27.879422 kubelet[2821]: E1216 12:21:27.878822 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-85msp_calico-system(6ad46257-20db-42d2-b357-93e753f0c2ca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:27.879422 kubelet[2821]: E1216 12:21:27.878852 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:21:28.444672 containerd[1576]: 
time="2025-12-16T12:21:28.444150002Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:21:28.814794 containerd[1576]: time="2025-12-16T12:21:28.814662199Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:28.816344 containerd[1576]: time="2025-12-16T12:21:28.816283905Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:21:28.816722 containerd[1576]: time="2025-12-16T12:21:28.816332466Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:28.817649 kubelet[2821]: E1216 12:21:28.817072 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:21:28.817649 kubelet[2821]: E1216 12:21:28.817132 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:21:28.817649 kubelet[2821]: E1216 12:21:28.817257 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:28.818824 containerd[1576]: time="2025-12-16T12:21:28.818782586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:21:29.173900 containerd[1576]: time="2025-12-16T12:21:29.173825584Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:29.175521 containerd[1576]: time="2025-12-16T12:21:29.175447810Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:21:29.175722 containerd[1576]: time="2025-12-16T12:21:29.175572252Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:29.176002 kubelet[2821]: E1216 12:21:29.175931 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:21:29.176002 kubelet[2821]: E1216 12:21:29.175992 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:21:29.176417 kubelet[2821]: E1216 12:21:29.176077 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:29.176417 kubelet[2821]: E1216 12:21:29.176119 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:21:36.271567 kernel: kauditd_printk_skb: 207 callbacks suppressed Dec 16 12:21:36.271711 kernel: audit: type=1130 audit(1765887696.269:729): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-46.224.130.63:22-147.75.109.163:55162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:36.269000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-46.224.130.63:22-147.75.109.163:55162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:36.270162 systemd[1]: Started sshd@8-46.224.130.63:22-147.75.109.163:55162.service - OpenSSH per-connection server daemon (147.75.109.163:55162). 
Dec 16 12:21:37.120431 sshd[5130]: Accepted publickey for core from 147.75.109.163 port 55162 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:21:37.119000 audit[5130]: USER_ACCT pid=5130 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:37.125744 kernel: audit: type=1101 audit(1765887697.119:730): pid=5130 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:37.125830 kernel: audit: type=1103 audit(1765887697.122:731): pid=5130 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:37.122000 audit[5130]: CRED_ACQ pid=5130 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:37.124542 sshd-session[5130]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:21:37.127868 kernel: audit: type=1006 audit(1765887697.123:732): pid=5130 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Dec 16 12:21:37.131457 kernel: audit: type=1300 audit(1765887697.123:732): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff034d290 a2=3 a3=0 items=0 ppid=1 pid=5130 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:37.123000 audit[5130]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff034d290 a2=3 a3=0 items=0 ppid=1 pid=5130 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:37.123000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:21:37.132498 systemd-logind[1543]: New session 9 of user core. Dec 16 12:21:37.132795 kernel: audit: type=1327 audit(1765887697.123:732): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:21:37.136917 systemd[1]: Started session-9.scope - Session 9 of User core. 
Dec 16 12:21:37.142000 audit[5130]: USER_START pid=5130 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:37.142000 audit[5134]: CRED_ACQ pid=5134 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:37.148612 kernel: audit: type=1105 audit(1765887697.142:733): pid=5130 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:37.148733 kernel: audit: type=1103 audit(1765887697.142:734): pid=5134 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:37.442365 kubelet[2821]: E1216 12:21:37.441489 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:21:37.699169 sshd[5134]: Connection closed by 147.75.109.163 port 55162 Dec 16 12:21:37.702058 sshd-session[5130]: pam_unix(sshd:session): session closed for user core Dec 16 12:21:37.703000 audit[5130]: USER_END pid=5130 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:37.703000 audit[5130]: CRED_DISP pid=5130 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:37.715145 kernel: audit: type=1106 audit(1765887697.703:735): pid=5130 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:37.715286 kernel: audit: type=1104 audit(1765887697.703:736): pid=5130 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 
12:21:37.716687 systemd[1]: sshd@8-46.224.130.63:22-147.75.109.163:55162.service: Deactivated successfully. Dec 16 12:21:37.720886 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 12:21:37.716000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-46.224.130.63:22-147.75.109.163:55162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:37.727853 systemd-logind[1543]: Session 9 logged out. Waiting for processes to exit. Dec 16 12:21:37.730212 systemd-logind[1543]: Removed session 9. Dec 16 12:21:39.443376 kubelet[2821]: E1216 12:21:39.442768 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:21:39.444908 kubelet[2821]: E1216 12:21:39.444037 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:21:40.447650 kubelet[2821]: E1216 12:21:40.445938 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:21:40.447650 kubelet[2821]: E1216 12:21:40.446864 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:21:41.452381 kubelet[2821]: E1216 12:21:41.452323 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:21:42.872108 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:21:42.872215 kernel: audit: type=1130 audit(1765887702.870:738): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-46.224.130.63:22-147.75.109.163:58914 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:42.870000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-46.224.130.63:22-147.75.109.163:58914 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:42.871471 systemd[1]: Started sshd@9-46.224.130.63:22-147.75.109.163:58914.service - OpenSSH per-connection server daemon (147.75.109.163:58914). Dec 16 12:21:43.727000 audit[5149]: USER_ACCT pid=5149 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:43.728364 sshd[5149]: Accepted publickey for core from 147.75.109.163 port 58914 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:21:43.730755 kernel: audit: type=1101 audit(1765887703.727:739): pid=5149 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:43.730000 audit[5149]: CRED_ACQ pid=5149 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:43.733313 sshd-session[5149]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:21:43.734671 kernel: audit: type=1103 audit(1765887703.730:740): pid=5149 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:43.734738 kernel: audit: type=1006 audit(1765887703.730:741): pid=5149 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Dec 16 12:21:43.734759 kernel: audit: type=1300 audit(1765887703.730:741): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe99a6d90 a2=3 a3=0 items=0 ppid=1 pid=5149 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:43.730000 audit[5149]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe99a6d90 a2=3 a3=0 items=0 ppid=1 pid=5149 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:43.738014 kernel: audit: type=1327 audit(1765887703.730:741): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:21:43.730000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:21:43.747171 systemd-logind[1543]: New session 10 of user core. Dec 16 12:21:43.752876 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 16 12:21:43.759000 audit[5149]: USER_START pid=5149 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:43.764000 audit[5153]: CRED_ACQ pid=5153 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:43.766935 kernel: audit: type=1105 audit(1765887703.759:742): pid=5149 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:43.767042 kernel: audit: type=1103 audit(1765887703.764:743): pid=5153 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:44.295726 sshd[5153]: Connection closed by 147.75.109.163 port 58914 Dec 16 12:21:44.296973 sshd-session[5149]: pam_unix(sshd:session): session closed for user core Dec 16 12:21:44.300000 audit[5149]: USER_END pid=5149 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:44.300000 audit[5149]: CRED_DISP pid=5149 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:44.305171 kernel: audit: type=1106 audit(1765887704.300:744): pid=5149 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:44.305237 kernel: audit: type=1104 audit(1765887704.300:745): pid=5149 uid=0 auid=500 ses=10 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:44.307088 systemd[1]: sshd@9-46.224.130.63:22-147.75.109.163:58914.service: Deactivated successfully. Dec 16 12:21:44.306000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-46.224.130.63:22-147.75.109.163:58914 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:44.310578 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 12:21:44.312286 systemd-logind[1543]: Session 10 logged out. Waiting for processes to exit. Dec 16 12:21:44.314391 systemd-logind[1543]: Removed session 10. Dec 16 12:21:49.471932 systemd[1]: Started sshd@10-46.224.130.63:22-147.75.109.163:58918.service - OpenSSH per-connection server daemon (147.75.109.163:58918). Dec 16 12:21:49.471000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-46.224.130.63:22-147.75.109.163:58918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:49.473743 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:21:49.473819 kernel: audit: type=1130 audit(1765887709.471:747): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-46.224.130.63:22-147.75.109.163:58918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:50.321000 audit[5167]: USER_ACCT pid=5167 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.323804 sshd[5167]: Accepted publickey for core from 147.75.109.163 port 58918 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:21:50.325685 kernel: audit: type=1101 audit(1765887710.321:748): pid=5167 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.325000 audit[5167]: CRED_ACQ pid=5167 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.326934 sshd-session[5167]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:21:50.329922 kernel: audit: type=1103 audit(1765887710.325:749): pid=5167 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.330005 kernel: audit: type=1006 audit(1765887710.325:750): pid=5167 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Dec 16 12:21:50.325000 audit[5167]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff18e5600 a2=3 a3=0 items=0 ppid=1 pid=5167 auid=500 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:50.332589 kernel: audit: type=1300 audit(1765887710.325:750): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff18e5600 a2=3 a3=0 items=0 ppid=1 pid=5167 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:50.333787 kernel: audit: type=1327 audit(1765887710.325:750): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:21:50.325000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:21:50.339841 systemd-logind[1543]: New session 11 of user core. Dec 16 12:21:50.345998 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 16 12:21:50.349000 audit[5167]: USER_START pid=5167 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.354797 kernel: audit: type=1105 audit(1765887710.349:751): pid=5167 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.356000 audit[5172]: CRED_ACQ pid=5172 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.361678 kernel: audit: type=1103 audit(1765887710.356:752): pid=5172 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.447635 kubelet[2821]: E1216 12:21:50.447465 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:21:50.898799 sshd[5172]: Connection closed by 147.75.109.163 port 58918 Dec 16 12:21:50.899544 sshd-session[5167]: pam_unix(sshd:session): session closed for user core Dec 16 12:21:50.901000 audit[5167]: USER_END pid=5167 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.911502 kernel: audit: type=1106 audit(1765887710.901:753): pid=5167 uid=0 auid=500 ses=11 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.911663 kernel: audit: type=1104 audit(1765887710.901:754): pid=5167 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.901000 audit[5167]: CRED_DISP pid=5167 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:50.912200 systemd-logind[1543]: Session 11 logged out. Waiting for processes to exit. Dec 16 12:21:50.912514 systemd[1]: sshd@10-46.224.130.63:22-147.75.109.163:58918.service: Deactivated successfully. Dec 16 12:21:50.915000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-46.224.130.63:22-147.75.109.163:58918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:50.919317 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 12:21:50.923433 systemd-logind[1543]: Removed session 11. Dec 16 12:21:51.071173 systemd[1]: Started sshd@11-46.224.130.63:22-147.75.109.163:58928.service - OpenSSH per-connection server daemon (147.75.109.163:58928). Dec 16 12:21:51.070000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-46.224.130.63:22-147.75.109.163:58928 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:21:51.443337 kubelet[2821]: E1216 12:21:51.443282 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:21:51.923000 audit[5185]: USER_ACCT pid=5185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:51.925187 sshd[5185]: Accepted publickey for core from 147.75.109.163 port 58928 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:21:51.926000 audit[5185]: CRED_ACQ pid=5185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:51.927000 audit[5185]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc00e5880 a2=3 a3=0 items=0 ppid=1 pid=5185 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:51.927000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:21:51.929432 sshd-session[5185]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:21:51.935390 systemd-logind[1543]: New session 12 of user core. Dec 16 12:21:51.939868 systemd[1]: Started session-12.scope - Session 12 of User core. 
Dec 16 12:21:51.942000 audit[5185]: USER_START pid=5185 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:51.945000 audit[5189]: CRED_ACQ pid=5189 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:52.445000 kubelet[2821]: E1216 12:21:52.444758 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:21:52.531714 sshd[5189]: Connection closed by 147.75.109.163 port 58928 Dec 16 12:21:52.532817 sshd-session[5185]: pam_unix(sshd:session): session closed for user core Dec 16 12:21:52.534000 audit[5185]: USER_END pid=5185 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:52.534000 audit[5185]: CRED_DISP pid=5185 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:52.539341 systemd[1]: sshd@11-46.224.130.63:22-147.75.109.163:58928.service: Deactivated successfully. Dec 16 12:21:52.540000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-46.224.130.63:22-147.75.109.163:58928 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:52.544996 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 12:21:52.552369 systemd-logind[1543]: Session 12 logged out. Waiting for processes to exit. Dec 16 12:21:52.556038 systemd-logind[1543]: Removed session 12. Dec 16 12:21:52.704000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-46.224.130.63:22-147.75.109.163:52360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:52.704945 systemd[1]: Started sshd@12-46.224.130.63:22-147.75.109.163:52360.service - OpenSSH per-connection server daemon (147.75.109.163:52360). 
Dec 16 12:21:53.443031 kubelet[2821]: E1216 12:21:53.441812 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:21:53.443031 kubelet[2821]: E1216 12:21:53.442851 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:21:53.443031 kubelet[2821]: E1216 12:21:53.442992 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:21:53.560000 audit[5199]: USER_ACCT pid=5199 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:53.561288 sshd[5199]: Accepted publickey for core from 147.75.109.163 port 52360 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:21:53.562000 audit[5199]: CRED_ACQ pid=5199 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:53.563000 audit[5199]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff173a830 a2=3 a3=0 items=0 ppid=1 pid=5199 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:53.563000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:21:53.565258 sshd-session[5199]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:21:53.572762 systemd-logind[1543]: New session 13 of user core. 
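[annotation, not part of the captured journal] The "Back-off pulling image" messages above recur for the same pods, with the gaps between reports growing over the course of the log. That pattern is consistent with the kubelet applying an exponential back-off to repeated image pulls; the commonly cited defaults are a 10-second initial delay doubling up to a 5-minute cap, but those numbers are an assumption here and are not read from this log. The sketch below only illustrates what such a schedule looks like.

    # backoff_schedule.py - illustrate an exponential image-pull back-off schedule.
    # The 10 s initial delay, factor of 2, and 300 s cap are assumed values; a
    # given kubelet build or configuration may use different parameters.
    def backoff_schedule(initial=10.0, factor=2.0, cap=300.0, attempts=8):
        """Yield (attempt, delay_seconds) pairs for an exponential back-off."""
        delay = initial
        for attempt in range(1, attempts + 1):
            yield attempt, min(delay, cap)
            delay *= factor

    total = 0.0
    for attempt, delay in backoff_schedule():
        total += delay
        print(f"retry {attempt}: wait {delay:5.0f}s (cumulative {total:5.0f}s)")

Under those assumed parameters the retry interval reaches its cap after a handful of attempts, which would match the way the sync errors above settle into reports spaced a few minutes apart.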
Dec 16 12:21:53.578189 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 16 12:21:53.586000 audit[5199]: USER_START pid=5199 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:53.588000 audit[5228]: CRED_ACQ pid=5228 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:54.120137 sshd[5228]: Connection closed by 147.75.109.163 port 52360 Dec 16 12:21:54.121124 sshd-session[5199]: pam_unix(sshd:session): session closed for user core Dec 16 12:21:54.122000 audit[5199]: USER_END pid=5199 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:54.123000 audit[5199]: CRED_DISP pid=5199 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:21:54.126851 systemd-logind[1543]: Session 13 logged out. Waiting for processes to exit. Dec 16 12:21:54.127954 systemd[1]: sshd@12-46.224.130.63:22-147.75.109.163:52360.service: Deactivated successfully. Dec 16 12:21:54.127000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-46.224.130.63:22-147.75.109.163:52360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:54.132257 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 12:21:54.136557 systemd-logind[1543]: Removed session 13. Dec 16 12:21:59.292696 systemd[1]: Started sshd@13-46.224.130.63:22-147.75.109.163:52364.service - OpenSSH per-connection server daemon (147.75.109.163:52364). Dec 16 12:21:59.291000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-46.224.130.63:22-147.75.109.163:52364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:59.294826 kernel: kauditd_printk_skb: 23 callbacks suppressed Dec 16 12:21:59.294989 kernel: audit: type=1130 audit(1765887719.291:774): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-46.224.130.63:22-147.75.109.163:52364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:22:00.131000 audit[5243]: USER_ACCT pid=5243 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.137822 kernel: audit: type=1101 audit(1765887720.131:775): pid=5243 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.138709 sshd[5243]: Accepted publickey for core from 147.75.109.163 port 52364 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:22:00.138000 audit[5243]: CRED_ACQ pid=5243 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.143027 sshd-session[5243]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:00.144388 kernel: audit: type=1103 audit(1765887720.138:776): pid=5243 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.144484 kernel: audit: type=1006 audit(1765887720.138:777): pid=5243 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Dec 16 12:22:00.138000 audit[5243]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffc7cb800 a2=3 a3=0 items=0 ppid=1 pid=5243 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:00.146746 kernel: audit: type=1300 audit(1765887720.138:777): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffc7cb800 a2=3 a3=0 items=0 ppid=1 pid=5243 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:00.138000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:00.148732 kernel: audit: type=1327 audit(1765887720.138:777): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:00.155794 systemd-logind[1543]: New session 14 of user core. Dec 16 12:22:00.161912 systemd[1]: Started session-14.scope - Session 14 of User core. 
Dec 16 12:22:00.166000 audit[5243]: USER_START pid=5243 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.170000 audit[5247]: CRED_ACQ pid=5247 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.174846 kernel: audit: type=1105 audit(1765887720.166:778): pid=5243 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.174909 kernel: audit: type=1103 audit(1765887720.170:779): pid=5247 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.705562 sshd[5247]: Connection closed by 147.75.109.163 port 52364 Dec 16 12:22:00.706813 sshd-session[5243]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:00.708000 audit[5243]: USER_END pid=5243 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.714557 systemd[1]: sshd@13-46.224.130.63:22-147.75.109.163:52364.service: Deactivated successfully. Dec 16 12:22:00.708000 audit[5243]: CRED_DISP pid=5243 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.716719 kernel: audit: type=1106 audit(1765887720.708:780): pid=5243 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.716800 kernel: audit: type=1104 audit(1765887720.708:781): pid=5243 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:00.715000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-46.224.130.63:22-147.75.109.163:52364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:00.720978 systemd[1]: session-14.scope: Deactivated successfully. Dec 16 12:22:00.725496 systemd-logind[1543]: Session 14 logged out. Waiting for processes to exit. 
Dec 16 12:22:00.729677 systemd-logind[1543]: Removed session 14. Dec 16 12:22:02.448659 kubelet[2821]: E1216 12:22:02.445996 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:22:02.448659 kubelet[2821]: E1216 12:22:02.446328 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:22:05.872000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-46.224.130.63:22-147.75.109.163:39920 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:05.873854 systemd[1]: Started sshd@14-46.224.130.63:22-147.75.109.163:39920.service - OpenSSH per-connection server daemon (147.75.109.163:39920). Dec 16 12:22:05.877170 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:22:05.877297 kernel: audit: type=1130 audit(1765887725.872:783): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-46.224.130.63:22-147.75.109.163:39920 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:22:06.714000 audit[5261]: USER_ACCT pid=5261 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:06.717820 sshd[5261]: Accepted publickey for core from 147.75.109.163 port 39920 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:22:06.719667 kernel: audit: type=1101 audit(1765887726.714:784): pid=5261 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:06.721035 sshd-session[5261]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:06.718000 audit[5261]: CRED_ACQ pid=5261 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:06.725705 kernel: audit: type=1103 audit(1765887726.718:785): pid=5261 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:06.725858 kernel: audit: type=1006 audit(1765887726.718:786): pid=5261 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Dec 16 12:22:06.718000 audit[5261]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3cb17e0 a2=3 a3=0 items=0 ppid=1 pid=5261 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:06.727962 kernel: audit: type=1300 audit(1765887726.718:786): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3cb17e0 a2=3 a3=0 items=0 ppid=1 pid=5261 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:06.718000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:06.730669 kernel: audit: type=1327 audit(1765887726.718:786): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:06.734813 systemd-logind[1543]: New session 15 of user core. Dec 16 12:22:06.740886 systemd[1]: Started session-15.scope - Session 15 of User core. 
Dec 16 12:22:06.744000 audit[5261]: USER_START pid=5261 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:06.748000 audit[5265]: CRED_ACQ pid=5265 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:06.752406 kernel: audit: type=1105 audit(1765887726.744:787): pid=5261 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:06.752541 kernel: audit: type=1103 audit(1765887726.748:788): pid=5265 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:07.286857 sshd[5265]: Connection closed by 147.75.109.163 port 39920 Dec 16 12:22:07.287594 sshd-session[5261]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:07.287000 audit[5261]: USER_END pid=5261 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:07.287000 audit[5261]: CRED_DISP pid=5261 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:07.296979 kernel: audit: type=1106 audit(1765887727.287:789): pid=5261 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:07.298896 kernel: audit: type=1104 audit(1765887727.287:790): pid=5261 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:07.296000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-46.224.130.63:22-147.75.109.163:39920 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:07.297959 systemd[1]: sshd@14-46.224.130.63:22-147.75.109.163:39920.service: Deactivated successfully. Dec 16 12:22:07.300958 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 12:22:07.303822 systemd-logind[1543]: Session 15 logged out. Waiting for processes to exit. 
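[annotation, not part of the captured journal] Interleaved with the pull failures, the journal records a series of short SSH sessions for user core from 147.75.109.163 (sessions 9 through 16 so far), each bracketed by sshd "Accepted publickey" / "Connection closed" lines, PAM and audit USER_START/USER_END records, and a systemd session scope. The sketch below pairs the accept and close lines by source address and port to list those sessions from a saved journal; the file name journal.txt is an assumed path and the message formats are taken from the lines above.

    # ssh_sessions.py - pair sshd "Accepted publickey" and "Connection closed"
    # lines from a saved journal dump ("journal.txt" is an assumed path).
    import re

    ACCEPT_RE = re.compile(r'Accepted publickey for (\S+) from (\S+) port (\d+)')
    CLOSE_RE = re.compile(r'Connection closed by (\S+) port (\d+)')

    open_sessions = {}  # (addr, port) -> user
    with open("journal.txt", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = ACCEPT_RE.search(line)
            if m:
                user, addr, port = m.groups()
                open_sessions[(addr, port)] = user
                continue
            m = CLOSE_RE.search(line)
            if m:
                addr, port = m.groups()
                user = open_sessions.pop((addr, port), "?")
                print(f"session for {user} from {addr}:{port} opened and closed")

    # Anything still tracked at the end never saw a matching close line.
    for (addr, port), user in open_sessions.items():
        print(f"session for {user} from {addr}:{port} still open at end of log")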
Dec 16 12:22:07.307184 systemd-logind[1543]: Removed session 15. Dec 16 12:22:07.440596 kubelet[2821]: E1216 12:22:07.440524 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:22:07.442355 kubelet[2821]: E1216 12:22:07.442304 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:22:08.443287 kubelet[2821]: E1216 12:22:08.443131 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:22:08.443287 kubelet[2821]: E1216 12:22:08.443246 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:22:12.456432 systemd[1]: Started sshd@15-46.224.130.63:22-147.75.109.163:60158.service - OpenSSH per-connection server daemon (147.75.109.163:60158). Dec 16 12:22:12.455000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-46.224.130.63:22-147.75.109.163:60158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:22:12.458909 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:22:12.458948 kernel: audit: type=1130 audit(1765887732.455:792): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-46.224.130.63:22-147.75.109.163:60158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:13.307000 audit[5279]: USER_ACCT pid=5279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.312419 sshd[5279]: Accepted publickey for core from 147.75.109.163 port 60158 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:22:13.313728 kernel: audit: type=1101 audit(1765887733.307:793): pid=5279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.313000 audit[5279]: CRED_ACQ pid=5279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.319012 sshd-session[5279]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:13.320767 kernel: audit: type=1103 audit(1765887733.313:794): pid=5279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.320852 kernel: audit: type=1006 audit(1765887733.313:795): pid=5279 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Dec 16 12:22:13.313000 audit[5279]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc72485e0 a2=3 a3=0 items=0 ppid=1 pid=5279 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:13.323547 kernel: audit: type=1300 audit(1765887733.313:795): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc72485e0 a2=3 a3=0 items=0 ppid=1 pid=5279 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:13.324678 kernel: audit: type=1327 audit(1765887733.313:795): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:13.313000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:13.332232 systemd-logind[1543]: New session 16 of user core. Dec 16 12:22:13.335875 systemd[1]: Started session-16.scope - Session 16 of User core. 
Dec 16 12:22:13.339000 audit[5279]: USER_START pid=5279 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.343000 audit[5283]: CRED_ACQ pid=5283 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.346935 kernel: audit: type=1105 audit(1765887733.339:796): pid=5279 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.347001 kernel: audit: type=1103 audit(1765887733.343:797): pid=5283 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.869694 sshd[5283]: Connection closed by 147.75.109.163 port 60158 Dec 16 12:22:13.870868 sshd-session[5279]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:13.870000 audit[5279]: USER_END pid=5279 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.871000 audit[5279]: CRED_DISP pid=5279 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.877406 kernel: audit: type=1106 audit(1765887733.870:798): pid=5279 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.877480 kernel: audit: type=1104 audit(1765887733.871:799): pid=5279 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:13.878063 systemd-logind[1543]: Session 16 logged out. Waiting for processes to exit. Dec 16 12:22:13.878298 systemd[1]: sshd@15-46.224.130.63:22-147.75.109.163:60158.service: Deactivated successfully. Dec 16 12:22:13.877000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-46.224.130.63:22-147.75.109.163:60158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:13.881320 systemd[1]: session-16.scope: Deactivated successfully. 
Dec 16 12:22:13.886141 systemd-logind[1543]: Removed session 16. Dec 16 12:22:14.044000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-46.224.130.63:22-147.75.109.163:60160 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:14.045720 systemd[1]: Started sshd@16-46.224.130.63:22-147.75.109.163:60160.service - OpenSSH per-connection server daemon (147.75.109.163:60160). Dec 16 12:22:14.893000 audit[5295]: USER_ACCT pid=5295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:14.895707 sshd[5295]: Accepted publickey for core from 147.75.109.163 port 60160 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:22:14.897000 audit[5295]: CRED_ACQ pid=5295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:14.897000 audit[5295]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdb7d20e0 a2=3 a3=0 items=0 ppid=1 pid=5295 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:14.897000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:14.900132 sshd-session[5295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:14.907899 systemd-logind[1543]: New session 17 of user core. Dec 16 12:22:14.919892 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 16 12:22:14.923000 audit[5295]: USER_START pid=5295 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:14.926000 audit[5299]: CRED_ACQ pid=5299 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:15.624332 sshd[5299]: Connection closed by 147.75.109.163 port 60160 Dec 16 12:22:15.626873 sshd-session[5295]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:15.627000 audit[5295]: USER_END pid=5295 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:15.627000 audit[5295]: CRED_DISP pid=5295 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:15.635270 systemd-logind[1543]: Session 17 logged out. 
Waiting for processes to exit. Dec 16 12:22:15.635874 systemd[1]: sshd@16-46.224.130.63:22-147.75.109.163:60160.service: Deactivated successfully. Dec 16 12:22:15.636000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-46.224.130.63:22-147.75.109.163:60160 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:15.640677 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 12:22:15.644247 systemd-logind[1543]: Removed session 17. Dec 16 12:22:15.795000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-46.224.130.63:22-147.75.109.163:60176 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:15.796543 systemd[1]: Started sshd@17-46.224.130.63:22-147.75.109.163:60176.service - OpenSSH per-connection server daemon (147.75.109.163:60176). Dec 16 12:22:16.442752 kubelet[2821]: E1216 12:22:16.442450 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:22:16.646000 audit[5309]: USER_ACCT pid=5309 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:16.648401 sshd[5309]: Accepted publickey for core from 147.75.109.163 port 60176 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:22:16.648000 audit[5309]: CRED_ACQ pid=5309 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:16.648000 audit[5309]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff1a441d0 a2=3 a3=0 items=0 ppid=1 pid=5309 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:16.648000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:16.651409 sshd-session[5309]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:16.658282 systemd-logind[1543]: New session 18 of user core. Dec 16 12:22:16.663903 systemd[1]: Started session-18.scope - Session 18 of User core. 
Dec 16 12:22:16.667000 audit[5309]: USER_START pid=5309 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:16.670000 audit[5313]: CRED_ACQ pid=5313 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:17.440964 kubelet[2821]: E1216 12:22:17.440818 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:22:17.682000 audit[5324]: NETFILTER_CFG table=filter:135 family=2 entries=26 op=nft_register_rule pid=5324 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:22:17.684941 kernel: kauditd_printk_skb: 20 callbacks suppressed Dec 16 12:22:17.685018 kernel: audit: type=1325 audit(1765887737.682:816): table=filter:135 family=2 entries=26 op=nft_register_rule pid=5324 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:22:17.682000 audit[5324]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffff685ff90 a2=0 a3=1 items=0 ppid=2972 pid=5324 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:17.689393 kernel: audit: type=1300 audit(1765887737.682:816): arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffff685ff90 a2=0 a3=1 items=0 ppid=2972 pid=5324 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:17.682000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:22:17.697878 kernel: audit: type=1327 audit(1765887737.682:816): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:22:17.699000 audit[5324]: NETFILTER_CFG table=nat:136 family=2 entries=20 op=nft_register_rule pid=5324 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:22:17.699000 audit[5324]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffff685ff90 a2=0 a3=1 items=0 ppid=2972 pid=5324 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:17.705076 kernel: audit: type=1325 audit(1765887737.699:817): table=nat:136 family=2 entries=20 op=nft_register_rule pid=5324 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:22:17.705165 kernel: audit: type=1300 audit(1765887737.699:817): arch=c00000b7 
syscall=211 success=yes exit=5772 a0=3 a1=fffff685ff90 a2=0 a3=1 items=0 ppid=2972 pid=5324 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:17.706707 kernel: audit: type=1327 audit(1765887737.699:817): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:22:17.699000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:22:17.871105 sshd[5313]: Connection closed by 147.75.109.163 port 60176 Dec 16 12:22:17.872014 sshd-session[5309]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:17.875000 audit[5309]: USER_END pid=5309 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:17.882010 systemd[1]: sshd@17-46.224.130.63:22-147.75.109.163:60176.service: Deactivated successfully. Dec 16 12:22:17.875000 audit[5309]: CRED_DISP pid=5309 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:17.885467 kernel: audit: type=1106 audit(1765887737.875:818): pid=5309 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:17.885557 kernel: audit: type=1104 audit(1765887737.875:819): pid=5309 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:17.881000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-46.224.130.63:22-147.75.109.163:60176 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:17.888585 kernel: audit: type=1131 audit(1765887737.881:820): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-46.224.130.63:22-147.75.109.163:60176 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:17.889062 systemd[1]: session-18.scope: Deactivated successfully. Dec 16 12:22:17.892033 systemd-logind[1543]: Session 18 logged out. Waiting for processes to exit. Dec 16 12:22:17.896058 systemd-logind[1543]: Removed session 18. Dec 16 12:22:18.035495 systemd[1]: Started sshd@18-46.224.130.63:22-147.75.109.163:60182.service - OpenSSH per-connection server daemon (147.75.109.163:60182). Dec 16 12:22:18.034000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-46.224.130.63:22-147.75.109.163:60182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:22:18.038669 kernel: audit: type=1130 audit(1765887738.034:821): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-46.224.130.63:22-147.75.109.163:60182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:18.444769 kubelet[2821]: E1216 12:22:18.444691 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:22:18.815000 audit[5333]: NETFILTER_CFG table=filter:137 family=2 entries=38 op=nft_register_rule pid=5333 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:22:18.815000 audit[5333]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffd828fc10 a2=0 a3=1 items=0 ppid=2972 pid=5333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:18.815000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:22:18.821000 audit[5333]: NETFILTER_CFG table=nat:138 family=2 entries=20 op=nft_register_rule pid=5333 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:22:18.821000 audit[5333]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffd828fc10 a2=0 a3=1 items=0 ppid=2972 pid=5333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:18.821000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:22:18.878000 audit[5329]: USER_ACCT pid=5329 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:18.881681 sshd[5329]: Accepted publickey for core from 147.75.109.163 port 60182 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:22:18.880000 audit[5329]: CRED_ACQ pid=5329 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:18.880000 audit[5329]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd06addf0 a2=3 a3=0 items=0 ppid=1 pid=5329 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:18.880000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:18.884063 sshd-session[5329]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:18.892661 systemd-logind[1543]: New session 19 of user core. Dec 16 12:22:18.896859 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 16 12:22:18.898000 audit[5329]: USER_START pid=5329 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:18.901000 audit[5335]: CRED_ACQ pid=5335 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:19.443931 kubelet[2821]: E1216 12:22:19.443849 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:22:19.619911 sshd[5335]: Connection closed by 147.75.109.163 port 60182 Dec 16 12:22:19.621353 sshd-session[5329]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:19.624000 audit[5329]: USER_END pid=5329 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:19.624000 audit[5329]: CRED_DISP pid=5329 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:19.630608 systemd-logind[1543]: Session 19 logged out. Waiting for processes to exit. Dec 16 12:22:19.631253 systemd[1]: sshd@18-46.224.130.63:22-147.75.109.163:60182.service: Deactivated successfully. Dec 16 12:22:19.632000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-46.224.130.63:22-147.75.109.163:60182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:19.638051 systemd[1]: session-19.scope: Deactivated successfully. 
Dec 16 12:22:19.646092 systemd-logind[1543]: Removed session 19. Dec 16 12:22:19.791000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-46.224.130.63:22-147.75.109.163:60198 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:19.792942 systemd[1]: Started sshd@19-46.224.130.63:22-147.75.109.163:60198.service - OpenSSH per-connection server daemon (147.75.109.163:60198). Dec 16 12:22:20.441813 kubelet[2821]: E1216 12:22:20.441751 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:22:20.648000 audit[5345]: USER_ACCT pid=5345 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:20.650840 sshd[5345]: Accepted publickey for core from 147.75.109.163 port 60198 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:22:20.651000 audit[5345]: CRED_ACQ pid=5345 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:20.651000 audit[5345]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffbe18900 a2=3 a3=0 items=0 ppid=1 pid=5345 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:20.651000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:20.654113 sshd-session[5345]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:20.663880 systemd-logind[1543]: New session 20 of user core. Dec 16 12:22:20.669969 systemd[1]: Started session-20.scope - Session 20 of User core. 
Dec 16 12:22:20.672000 audit[5345]: USER_START pid=5345 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:20.675000 audit[5350]: CRED_ACQ pid=5350 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:21.215743 sshd[5350]: Connection closed by 147.75.109.163 port 60198 Dec 16 12:22:21.216727 sshd-session[5345]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:21.217000 audit[5345]: USER_END pid=5345 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:21.217000 audit[5345]: CRED_DISP pid=5345 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:21.223590 systemd[1]: sshd@19-46.224.130.63:22-147.75.109.163:60198.service: Deactivated successfully. Dec 16 12:22:21.222000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-46.224.130.63:22-147.75.109.163:60198 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:21.229030 systemd[1]: session-20.scope: Deactivated successfully. Dec 16 12:22:21.231497 systemd-logind[1543]: Session 20 logged out. Waiting for processes to exit. Dec 16 12:22:21.235223 systemd-logind[1543]: Removed session 20. 
Dec 16 12:22:21.441456 kubelet[2821]: E1216 12:22:21.441407 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:22:22.002000 audit[5361]: NETFILTER_CFG table=filter:139 family=2 entries=26 op=nft_register_rule pid=5361 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:22:22.002000 audit[5361]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd5f2f6f0 a2=0 a3=1 items=0 ppid=2972 pid=5361 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:22.002000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:22:22.013000 audit[5361]: NETFILTER_CFG table=nat:140 family=2 entries=104 op=nft_register_chain pid=5361 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:22:22.013000 audit[5361]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffd5f2f6f0 a2=0 a3=1 items=0 ppid=2972 pid=5361 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:22.013000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:22:26.388984 systemd[1]: Started sshd@20-46.224.130.63:22-147.75.109.163:56192.service - OpenSSH per-connection server daemon (147.75.109.163:56192). Dec 16 12:22:26.391519 kernel: kauditd_printk_skb: 33 callbacks suppressed Dec 16 12:22:26.391552 kernel: audit: type=1130 audit(1765887746.387:843): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-46.224.130.63:22-147.75.109.163:56192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:26.387000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-46.224.130.63:22-147.75.109.163:56192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:22:27.233000 audit[5386]: USER_ACCT pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.238774 sshd[5386]: Accepted publickey for core from 147.75.109.163 port 56192 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:22:27.239729 kernel: audit: type=1101 audit(1765887747.233:844): pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.239000 audit[5386]: CRED_ACQ pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.244598 sshd-session[5386]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:27.246937 kernel: audit: type=1103 audit(1765887747.239:845): pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.247024 kernel: audit: type=1006 audit(1765887747.239:846): pid=5386 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Dec 16 12:22:27.239000 audit[5386]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdb74e180 a2=3 a3=0 items=0 ppid=1 pid=5386 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:27.250024 kernel: audit: type=1300 audit(1765887747.239:846): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdb74e180 a2=3 a3=0 items=0 ppid=1 pid=5386 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:27.239000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:27.251045 kernel: audit: type=1327 audit(1765887747.239:846): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:27.256829 systemd-logind[1543]: New session 21 of user core. Dec 16 12:22:27.261185 systemd[1]: Started session-21.scope - Session 21 of User core. 
Dec 16 12:22:27.264000 audit[5386]: USER_START pid=5386 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.269000 audit[5390]: CRED_ACQ pid=5390 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.273602 kernel: audit: type=1105 audit(1765887747.264:847): pid=5386 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.273734 kernel: audit: type=1103 audit(1765887747.269:848): pid=5390 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.805074 sshd[5390]: Connection closed by 147.75.109.163 port 56192 Dec 16 12:22:27.807963 sshd-session[5386]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:27.809000 audit[5386]: USER_END pid=5386 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.809000 audit[5386]: CRED_DISP pid=5386 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.820096 kernel: audit: type=1106 audit(1765887747.809:849): pid=5386 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.820287 kernel: audit: type=1104 audit(1765887747.809:850): pid=5386 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:27.819803 systemd[1]: sshd@20-46.224.130.63:22-147.75.109.163:56192.service: Deactivated successfully. Dec 16 12:22:27.819000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-46.224.130.63:22-147.75.109.163:56192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:27.825145 systemd[1]: session-21.scope: Deactivated successfully. Dec 16 12:22:27.826494 systemd-logind[1543]: Session 21 logged out. Waiting for processes to exit. 
Dec 16 12:22:27.828847 systemd-logind[1543]: Removed session 21. Dec 16 12:22:31.441017 kubelet[2821]: E1216 12:22:31.440944 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:22:31.441883 kubelet[2821]: E1216 12:22:31.441468 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:22:32.448616 kubelet[2821]: E1216 12:22:32.448386 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:22:32.983000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-46.224.130.63:22-147.75.109.163:32842 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:32.984083 systemd[1]: Started sshd@21-46.224.130.63:22-147.75.109.163:32842.service - OpenSSH per-connection server daemon (147.75.109.163:32842). Dec 16 12:22:32.987287 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:22:32.987470 kernel: audit: type=1130 audit(1765887752.983:852): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-46.224.130.63:22-147.75.109.163:32842 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:22:33.441747 kubelet[2821]: E1216 12:22:33.441594 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:22:33.442927 containerd[1576]: time="2025-12-16T12:22:33.442778199Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:22:33.781901 containerd[1576]: time="2025-12-16T12:22:33.781045652Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:22:33.783408 containerd[1576]: time="2025-12-16T12:22:33.783321300Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:22:33.783506 containerd[1576]: time="2025-12-16T12:22:33.783438701Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:22:33.784076 kubelet[2821]: E1216 12:22:33.783713 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:22:33.784076 kubelet[2821]: E1216 12:22:33.783791 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:22:33.784076 kubelet[2821]: E1216 12:22:33.783897 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-85dd648564-9wttf_calico-system(b32c8c20-6807-4c19-8ec5-b6f0be7cc07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:22:33.785655 containerd[1576]: time="2025-12-16T12:22:33.785477388Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:22:33.842000 audit[5402]: USER_ACCT pid=5402 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:33.846681 sshd[5402]: Accepted publickey for core from 147.75.109.163 port 32842 ssh2: RSA SHA256:Tx2BWscHxMi4pW0J1Au8h0VXMqK5+1v+Um0l7o/SYzc Dec 16 12:22:33.849818 kernel: audit: type=1101 audit(1765887753.842:853): pid=5402 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 
terminal=ssh res=success' Dec 16 12:22:33.849960 kernel: audit: type=1103 audit(1765887753.846:854): pid=5402 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:33.846000 audit[5402]: CRED_ACQ pid=5402 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:33.848871 sshd-session[5402]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:33.852582 kernel: audit: type=1006 audit(1765887753.846:855): pid=5402 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Dec 16 12:22:33.855603 kernel: audit: type=1300 audit(1765887753.846:855): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcf52f9f0 a2=3 a3=0 items=0 ppid=1 pid=5402 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:33.846000 audit[5402]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcf52f9f0 a2=3 a3=0 items=0 ppid=1 pid=5402 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:22:33.858658 kernel: audit: type=1327 audit(1765887753.846:855): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:33.846000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:22:33.862667 systemd-logind[1543]: New session 22 of user core. Dec 16 12:22:33.867881 systemd[1]: Started session-22.scope - Session 22 of User core. 
Dec 16 12:22:33.872000 audit[5402]: USER_START pid=5402 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:33.876000 audit[5406]: CRED_ACQ pid=5406 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:33.879632 kernel: audit: type=1105 audit(1765887753.872:856): pid=5402 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:33.879702 kernel: audit: type=1103 audit(1765887753.876:857): pid=5406 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:34.160561 containerd[1576]: time="2025-12-16T12:22:34.160189718Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:22:34.162699 containerd[1576]: time="2025-12-16T12:22:34.161766164Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:22:34.163490 containerd[1576]: time="2025-12-16T12:22:34.161830884Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:22:34.163590 kubelet[2821]: E1216 12:22:34.163017 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:22:34.163590 kubelet[2821]: E1216 12:22:34.163072 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:22:34.163590 kubelet[2821]: E1216 12:22:34.163164 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-85dd648564-9wttf_calico-system(b32c8c20-6807-4c19-8ec5-b6f0be7cc07e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:22:34.163590 kubelet[2821]: E1216 12:22:34.163210 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:22:34.430194 sshd[5406]: Connection closed by 147.75.109.163 port 32842 Dec 16 12:22:34.430830 sshd-session[5402]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:34.433000 audit[5402]: USER_END pid=5402 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:34.433000 audit[5402]: CRED_DISP pid=5402 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:34.441440 kubelet[2821]: E1216 12:22:34.440313 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:22:34.443040 kernel: audit: type=1106 audit(1765887754.433:858): pid=5402 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:34.443102 kernel: audit: type=1104 audit(1765887754.433:859): pid=5402 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:22:34.442000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-46.224.130.63:22-147.75.109.163:32842 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:22:34.443773 systemd[1]: sshd@21-46.224.130.63:22-147.75.109.163:32842.service: Deactivated successfully. Dec 16 12:22:34.443931 systemd-logind[1543]: Session 22 logged out. Waiting for processes to exit. Dec 16 12:22:34.451741 systemd[1]: session-22.scope: Deactivated successfully. Dec 16 12:22:34.454805 systemd-logind[1543]: Removed session 22. 
Dec 16 12:22:42.445293 kubelet[2821]: E1216 12:22:42.445243 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:22:46.443296 kubelet[2821]: E1216 12:22:46.442999 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:22:46.446442 kubelet[2821]: E1216 12:22:46.443034 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:22:46.446442 kubelet[2821]: E1216 12:22:46.445144 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:22:47.441850 containerd[1576]: time="2025-12-16T12:22:47.441793861Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:22:47.445264 kubelet[2821]: E1216 12:22:47.442548 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:22:47.789247 containerd[1576]: time="2025-12-16T12:22:47.788750082Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:22:47.791235 containerd[1576]: time="2025-12-16T12:22:47.791173657Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:22:47.791444 containerd[1576]: time="2025-12-16T12:22:47.791315017Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:22:47.791737 kubelet[2821]: E1216 12:22:47.791672 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:22:47.791900 kubelet[2821]: E1216 12:22:47.791723 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:22:47.792015 kubelet[2821]: E1216 12:22:47.791995 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-85ddfd7b99-s9l56_calico-apiserver(b4c4df94-c3ee-426c-b06d-ed9edc99469b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:22:47.792128 kubelet[2821]: E1216 12:22:47.792106 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:22:55.441644 containerd[1576]: time="2025-12-16T12:22:55.441347351Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:22:55.771190 containerd[1576]: time="2025-12-16T12:22:55.770354255Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:22:55.772451 containerd[1576]: time="2025-12-16T12:22:55.772224589Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:22:55.772451 containerd[1576]: time="2025-12-16T12:22:55.772347390Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 
12:22:55.772613 kubelet[2821]: E1216 12:22:55.772567 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:22:55.773036 kubelet[2821]: E1216 12:22:55.772615 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:22:55.773036 kubelet[2821]: E1216 12:22:55.772741 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5d865fff8d-bxz6x_calico-system(bbfa367f-11d3-466c-9181-c6ee23836f5f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:22:55.773036 kubelet[2821]: E1216 12:22:55.772776 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:22:58.444744 kubelet[2821]: E1216 12:22:58.444387 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:22:58.445833 containerd[1576]: time="2025-12-16T12:22:58.444510105Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:22:58.803897 containerd[1576]: time="2025-12-16T12:22:58.803666330Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:22:58.805059 containerd[1576]: time="2025-12-16T12:22:58.804985780Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:22:58.805304 containerd[1576]: time="2025-12-16T12:22:58.805093461Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes 
read=0" Dec 16 12:22:58.805666 kubelet[2821]: E1216 12:22:58.805596 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:22:58.805821 kubelet[2821]: E1216 12:22:58.805672 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:22:58.805821 kubelet[2821]: E1216 12:22:58.805754 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-85msp_calico-system(6ad46257-20db-42d2-b357-93e753f0c2ca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:22:58.805821 kubelet[2821]: E1216 12:22:58.805788 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-85msp" podUID="6ad46257-20db-42d2-b357-93e753f0c2ca" Dec 16 12:23:01.442712 containerd[1576]: time="2025-12-16T12:23:01.442392215Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:23:01.443268 kubelet[2821]: E1216 12:23:01.442559 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-s9l56" podUID="b4c4df94-c3ee-426c-b06d-ed9edc99469b" Dec 16 12:23:01.794682 containerd[1576]: time="2025-12-16T12:23:01.794430084Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:01.796324 containerd[1576]: time="2025-12-16T12:23:01.796156257Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:23:01.796324 containerd[1576]: time="2025-12-16T12:23:01.796243658Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:23:01.796711 kubelet[2821]: E1216 12:23:01.796461 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:23:01.796711 kubelet[2821]: E1216 12:23:01.796516 2821 kuberuntime_image.go:43] "Failed to pull image" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:23:01.797043 kubelet[2821]: E1216 12:23:01.796799 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-85ddfd7b99-lzr5b_calico-apiserver(4dd8654f-30cc-4aed-bf7b-e1600d664a65): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:01.797043 kubelet[2821]: E1216 12:23:01.796842 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85ddfd7b99-lzr5b" podUID="4dd8654f-30cc-4aed-bf7b-e1600d664a65" Dec 16 12:23:01.797280 containerd[1576]: time="2025-12-16T12:23:01.797179026Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:23:02.139581 containerd[1576]: time="2025-12-16T12:23:02.139485355Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:02.141647 containerd[1576]: time="2025-12-16T12:23:02.141514091Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:23:02.141878 containerd[1576]: time="2025-12-16T12:23:02.141575492Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:23:02.141977 kubelet[2821]: E1216 12:23:02.141782 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:23:02.141977 kubelet[2821]: E1216 12:23:02.141830 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:23:02.141977 kubelet[2821]: E1216 12:23:02.141946 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:02.143435 containerd[1576]: time="2025-12-16T12:23:02.143401106Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:23:02.483106 containerd[1576]: time="2025-12-16T12:23:02.482947359Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:02.485430 containerd[1576]: 
time="2025-12-16T12:23:02.485267498Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:23:02.485430 containerd[1576]: time="2025-12-16T12:23:02.485335818Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:23:02.486115 kubelet[2821]: E1216 12:23:02.485990 2821 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:23:02.486115 kubelet[2821]: E1216 12:23:02.486081 2821 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:23:02.487346 kubelet[2821]: E1216 12:23:02.487044 2821 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-6nr2l_calico-system(30279a80-ac32-4c4e-affe-8e2742945896): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:02.487346 kubelet[2821]: E1216 12:23:02.487246 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6nr2l" podUID="30279a80-ac32-4c4e-affe-8e2742945896" Dec 16 12:23:06.926466 kubelet[2821]: E1216 12:23:06.926373 2821 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:34934->10.0.0.2:2379: read: connection timed out" Dec 16 12:23:07.113175 systemd[1]: cri-containerd-a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e.scope: Deactivated successfully. Dec 16 12:23:07.114608 systemd[1]: cri-containerd-a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e.scope: Consumed 5.395s CPU time, 66.2M memory peak, 2.6M read from disk. 
Dec 16 12:23:07.113000 audit: BPF prog-id=254 op=LOAD Dec 16 12:23:07.116069 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:23:07.116202 kernel: audit: type=1334 audit(1765887787.113:861): prog-id=254 op=LOAD Dec 16 12:23:07.115000 audit: BPF prog-id=91 op=UNLOAD Dec 16 12:23:07.118643 kernel: audit: type=1334 audit(1765887787.115:862): prog-id=91 op=UNLOAD Dec 16 12:23:07.119402 containerd[1576]: time="2025-12-16T12:23:07.119312321Z" level=info msg="received container exit event container_id:\"a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e\" id:\"a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e\" pid:2678 exit_status:1 exited_at:{seconds:1765887787 nanos:117000861}" Dec 16 12:23:07.120464 kernel: audit: type=1334 audit(1765887787.118:863): prog-id=106 op=UNLOAD Dec 16 12:23:07.118000 audit: BPF prog-id=106 op=UNLOAD Dec 16 12:23:07.118000 audit: BPF prog-id=110 op=UNLOAD Dec 16 12:23:07.121643 kernel: audit: type=1334 audit(1765887787.118:864): prog-id=110 op=UNLOAD Dec 16 12:23:07.150365 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e-rootfs.mount: Deactivated successfully. Dec 16 12:23:07.428965 kubelet[2821]: I1216 12:23:07.428907 2821 scope.go:117] "RemoveContainer" containerID="a1dbe4e26788a5151358e7c162a130871a0aeaa00ae87b161311983d6d13ec6e" Dec 16 12:23:07.433651 containerd[1576]: time="2025-12-16T12:23:07.433271356Z" level=info msg="CreateContainer within sandbox \"2dc9032b1843f2e1e8089b687e7f696c4861dee1a05bfce81569647f89ec0454\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Dec 16 12:23:07.441218 kubelet[2821]: E1216 12:23:07.441107 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d865fff8d-bxz6x" podUID="bbfa367f-11d3-466c-9181-c6ee23836f5f" Dec 16 12:23:07.444498 containerd[1576]: time="2025-12-16T12:23:07.444453573Z" level=info msg="Container c955bd99b08b1309e494fbad792c95d5b1b3d67ce2cfa262c8b0ed0ca9dc50f3: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:23:07.457760 containerd[1576]: time="2025-12-16T12:23:07.457710247Z" level=info msg="CreateContainer within sandbox \"2dc9032b1843f2e1e8089b687e7f696c4861dee1a05bfce81569647f89ec0454\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"c955bd99b08b1309e494fbad792c95d5b1b3d67ce2cfa262c8b0ed0ca9dc50f3\"" Dec 16 12:23:07.458469 containerd[1576]: time="2025-12-16T12:23:07.458392733Z" level=info msg="StartContainer for \"c955bd99b08b1309e494fbad792c95d5b1b3d67ce2cfa262c8b0ed0ca9dc50f3\"" Dec 16 12:23:07.460084 containerd[1576]: time="2025-12-16T12:23:07.460031548Z" level=info msg="connecting to shim c955bd99b08b1309e494fbad792c95d5b1b3d67ce2cfa262c8b0ed0ca9dc50f3" address="unix:///run/containerd/s/6b1046f13ac1c35f6ecbe65361497e405f3160869526199bba89b8853e52f1a7" protocol=ttrpc version=3 Dec 16 12:23:07.488878 systemd[1]: Started cri-containerd-c955bd99b08b1309e494fbad792c95d5b1b3d67ce2cfa262c8b0ed0ca9dc50f3.scope - libcontainer container 
c955bd99b08b1309e494fbad792c95d5b1b3d67ce2cfa262c8b0ed0ca9dc50f3. Dec 16 12:23:07.507000 audit: BPF prog-id=255 op=LOAD Dec 16 12:23:07.510668 kernel: audit: type=1334 audit(1765887787.507:865): prog-id=255 op=LOAD Dec 16 12:23:07.510807 kernel: audit: type=1334 audit(1765887787.508:866): prog-id=256 op=LOAD Dec 16 12:23:07.508000 audit: BPF prog-id=256 op=LOAD Dec 16 12:23:07.513651 kernel: audit: type=1300 audit(1765887787.508:866): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000220180 a2=98 a3=0 items=0 ppid=2534 pid=5500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:07.513738 kernel: audit: type=1327 audit(1765887787.508:866): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353562643939623038623133303965343934666261643739326339 Dec 16 12:23:07.508000 audit[5500]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000220180 a2=98 a3=0 items=0 ppid=2534 pid=5500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:07.508000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353562643939623038623133303965343934666261643739326339 Dec 16 12:23:07.509000 audit: BPF prog-id=256 op=UNLOAD Dec 16 12:23:07.509000 audit[5500]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2534 pid=5500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:07.518853 kernel: audit: type=1334 audit(1765887787.509:867): prog-id=256 op=UNLOAD Dec 16 12:23:07.518920 kernel: audit: type=1300 audit(1765887787.509:867): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2534 pid=5500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:07.509000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353562643939623038623133303965343934666261643739326339 Dec 16 12:23:07.509000 audit: BPF prog-id=257 op=LOAD Dec 16 12:23:07.509000 audit[5500]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40002203e8 a2=98 a3=0 items=0 ppid=2534 pid=5500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:07.509000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353562643939623038623133303965343934666261643739326339 Dec 16 12:23:07.509000 audit: BPF prog-id=258 op=LOAD Dec 16 
12:23:07.509000 audit[5500]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000220168 a2=98 a3=0 items=0 ppid=2534 pid=5500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:07.509000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353562643939623038623133303965343934666261643739326339 Dec 16 12:23:07.509000 audit: BPF prog-id=258 op=UNLOAD Dec 16 12:23:07.509000 audit[5500]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2534 pid=5500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:07.509000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353562643939623038623133303965343934666261643739326339 Dec 16 12:23:07.509000 audit: BPF prog-id=257 op=UNLOAD Dec 16 12:23:07.509000 audit[5500]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2534 pid=5500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:07.509000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353562643939623038623133303965343934666261643739326339 Dec 16 12:23:07.509000 audit: BPF prog-id=259 op=LOAD Dec 16 12:23:07.509000 audit[5500]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000220648 a2=98 a3=0 items=0 ppid=2534 pid=5500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:07.509000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339353562643939623038623133303965343934666261643739326339 Dec 16 12:23:07.550427 containerd[1576]: time="2025-12-16T12:23:07.550366289Z" level=info msg="StartContainer for \"c955bd99b08b1309e494fbad792c95d5b1b3d67ce2cfa262c8b0ed0ca9dc50f3\" returns successfully" Dec 16 12:23:07.851466 systemd[1]: cri-containerd-9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e.scope: Deactivated successfully. Dec 16 12:23:07.851815 systemd[1]: cri-containerd-9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e.scope: Consumed 42.575s CPU time, 103M memory peak. 
Dec 16 12:23:07.854000 audit: BPF prog-id=144 op=UNLOAD Dec 16 12:23:07.854000 audit: BPF prog-id=148 op=UNLOAD Dec 16 12:23:07.858666 containerd[1576]: time="2025-12-16T12:23:07.858606514Z" level=info msg="received container exit event container_id:\"9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e\" id:\"9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e\" pid:3147 exit_status:1 exited_at:{seconds:1765887787 nanos:857860708}" Dec 16 12:23:07.900263 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e-rootfs.mount: Deactivated successfully. Dec 16 12:23:08.438700 kubelet[2821]: I1216 12:23:08.438221 2821 scope.go:117] "RemoveContainer" containerID="9a817db1aa96d352701b781365fff58ded038fbd6a6b9599d6b1f8449ff2106e" Dec 16 12:23:08.444660 containerd[1576]: time="2025-12-16T12:23:08.443981187Z" level=info msg="CreateContainer within sandbox \"3d98d97208929ab448becaafe6402441f3e2835fda6df524c238744b2228d46c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 16 12:23:08.457910 containerd[1576]: time="2025-12-16T12:23:08.457869428Z" level=info msg="Container 8f20ba51a356b58dd4e924ad8ca0bb27005794027bb42bd51bf23e69a3845f21: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:23:08.469256 containerd[1576]: time="2025-12-16T12:23:08.469213448Z" level=info msg="CreateContainer within sandbox \"3d98d97208929ab448becaafe6402441f3e2835fda6df524c238744b2228d46c\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"8f20ba51a356b58dd4e924ad8ca0bb27005794027bb42bd51bf23e69a3845f21\"" Dec 16 12:23:08.470709 containerd[1576]: time="2025-12-16T12:23:08.470201736Z" level=info msg="StartContainer for \"8f20ba51a356b58dd4e924ad8ca0bb27005794027bb42bd51bf23e69a3845f21\"" Dec 16 12:23:08.472367 containerd[1576]: time="2025-12-16T12:23:08.472325875Z" level=info msg="connecting to shim 8f20ba51a356b58dd4e924ad8ca0bb27005794027bb42bd51bf23e69a3845f21" address="unix:///run/containerd/s/11d457f455a69769cb8e11ebb17de6b024e43111f23e59a6f3e7ffe391a2f391" protocol=ttrpc version=3 Dec 16 12:23:08.509916 systemd[1]: Started cri-containerd-8f20ba51a356b58dd4e924ad8ca0bb27005794027bb42bd51bf23e69a3845f21.scope - libcontainer container 8f20ba51a356b58dd4e924ad8ca0bb27005794027bb42bd51bf23e69a3845f21. 
Dec 16 12:23:08.526000 audit: BPF prog-id=260 op=LOAD Dec 16 12:23:08.526000 audit: BPF prog-id=261 op=LOAD Dec 16 12:23:08.526000 audit[5542]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c180 a2=98 a3=0 items=0 ppid=2927 pid=5542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:08.526000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866323062613531613335366235386464346539323461643863613062 Dec 16 12:23:08.526000 audit: BPF prog-id=261 op=UNLOAD Dec 16 12:23:08.526000 audit[5542]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2927 pid=5542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:08.526000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866323062613531613335366235386464346539323461643863613062 Dec 16 12:23:08.527000 audit: BPF prog-id=262 op=LOAD Dec 16 12:23:08.527000 audit[5542]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c3e8 a2=98 a3=0 items=0 ppid=2927 pid=5542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:08.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866323062613531613335366235386464346539323461643863613062 Dec 16 12:23:08.527000 audit: BPF prog-id=263 op=LOAD Dec 16 12:23:08.527000 audit[5542]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=400010c168 a2=98 a3=0 items=0 ppid=2927 pid=5542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:08.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866323062613531613335366235386464346539323461643863613062 Dec 16 12:23:08.527000 audit: BPF prog-id=263 op=UNLOAD Dec 16 12:23:08.527000 audit[5542]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2927 pid=5542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:08.527000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866323062613531613335366235386464346539323461643863613062 Dec 16 12:23:08.528000 audit: BPF prog-id=262 op=UNLOAD Dec 16 12:23:08.528000 audit[5542]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2927 pid=5542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:08.528000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866323062613531613335366235386464346539323461643863613062 Dec 16 12:23:08.528000 audit: BPF prog-id=264 op=LOAD Dec 16 12:23:08.528000 audit[5542]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c648 a2=98 a3=0 items=0 ppid=2927 pid=5542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:08.528000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866323062613531613335366235386464346539323461643863613062 Dec 16 12:23:08.556835 containerd[1576]: time="2025-12-16T12:23:08.556788655Z" level=info msg="StartContainer for \"8f20ba51a356b58dd4e924ad8ca0bb27005794027bb42bd51bf23e69a3845f21\" returns successfully" Dec 16 12:23:10.442699 kubelet[2821]: E1216 12:23:10.442613 2821 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-85dd648564-9wttf" podUID="b32c8c20-6807-4c19-8ec5-b6f0be7cc07e" Dec 16 12:23:11.483845 systemd[1]: cri-containerd-73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10.scope: Deactivated successfully. Dec 16 12:23:11.484000 audit: BPF prog-id=265 op=LOAD Dec 16 12:23:11.484000 audit: BPF prog-id=86 op=UNLOAD Dec 16 12:23:11.484515 systemd[1]: cri-containerd-73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10.scope: Consumed 4.078s CPU time, 26.3M memory peak, 3.4M read from disk. Dec 16 12:23:11.488294 containerd[1576]: time="2025-12-16T12:23:11.488170276Z" level=info msg="received container exit event container_id:\"73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10\" id:\"73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10\" pid:2672 exit_status:1 exited_at:{seconds:1765887791 nanos:487584591}" Dec 16 12:23:11.488000 audit: BPF prog-id=101 op=UNLOAD Dec 16 12:23:11.488000 audit: BPF prog-id=105 op=UNLOAD Dec 16 12:23:11.518858 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-73f6c5ed2826018039b35715054aae0a3b2b46b2fd85fee9f6e2500af3506c10-rootfs.mount: Deactivated successfully.