Dec 16 02:07:25.440753 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Dec 16 02:07:25.440778 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Tue Dec 16 00:05:24 -00 2025 Dec 16 02:07:25.440788 kernel: KASLR enabled Dec 16 02:07:25.440795 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II Dec 16 02:07:25.440801 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218 Dec 16 02:07:25.440807 kernel: random: crng init done Dec 16 02:07:25.440814 kernel: secureboot: Secure boot disabled Dec 16 02:07:25.440821 kernel: ACPI: Early table checksum verification disabled Dec 16 02:07:25.440827 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS ) Dec 16 02:07:25.440834 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013) Dec 16 02:07:25.440841 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 02:07:25.440847 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 02:07:25.440853 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 02:07:25.440859 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 02:07:25.440868 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 02:07:25.440875 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 02:07:25.440882 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 02:07:25.440888 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 02:07:25.440895 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 02:07:25.440901 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013) Dec 16 02:07:25.440907 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Dec 16 02:07:25.440914 kernel: ACPI: Use ACPI SPCR as default console: Yes Dec 16 02:07:25.440921 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff] Dec 16 02:07:25.440929 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff] Dec 16 02:07:25.440935 kernel: Zone ranges: Dec 16 02:07:25.440942 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Dec 16 02:07:25.440948 kernel: DMA32 empty Dec 16 02:07:25.440954 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff] Dec 16 02:07:25.440961 kernel: Device empty Dec 16 02:07:25.440967 kernel: Movable zone start for each node Dec 16 02:07:25.440974 kernel: Early memory node ranges Dec 16 02:07:25.440980 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff] Dec 16 02:07:25.440988 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff] Dec 16 02:07:25.440994 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff] Dec 16 02:07:25.441001 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff] Dec 16 02:07:25.441009 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff] Dec 16 02:07:25.441015 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff] Dec 16 02:07:25.441021 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff] Dec 16 02:07:25.441267 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff] Dec 16 02:07:25.441275 kernel: node 0: [mem 
0x0000000139fe0000-0x0000000139ffffff] Dec 16 02:07:25.441288 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff] Dec 16 02:07:25.441297 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges Dec 16 02:07:25.441304 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1 Dec 16 02:07:25.441311 kernel: psci: probing for conduit method from ACPI. Dec 16 02:07:25.441318 kernel: psci: PSCIv1.1 detected in firmware. Dec 16 02:07:25.441325 kernel: psci: Using standard PSCI v0.2 function IDs Dec 16 02:07:25.441332 kernel: psci: Trusted OS migration not required Dec 16 02:07:25.441339 kernel: psci: SMC Calling Convention v1.1 Dec 16 02:07:25.441347 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Dec 16 02:07:25.441355 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Dec 16 02:07:25.441362 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Dec 16 02:07:25.441369 kernel: pcpu-alloc: [0] 0 [0] 1 Dec 16 02:07:25.441376 kernel: Detected PIPT I-cache on CPU0 Dec 16 02:07:25.441383 kernel: CPU features: detected: GIC system register CPU interface Dec 16 02:07:25.441390 kernel: CPU features: detected: Spectre-v4 Dec 16 02:07:25.441397 kernel: CPU features: detected: Spectre-BHB Dec 16 02:07:25.441404 kernel: CPU features: kernel page table isolation forced ON by KASLR Dec 16 02:07:25.441411 kernel: CPU features: detected: Kernel page table isolation (KPTI) Dec 16 02:07:25.441418 kernel: CPU features: detected: ARM erratum 1418040 Dec 16 02:07:25.441425 kernel: CPU features: detected: SSBS not fully self-synchronizing Dec 16 02:07:25.441433 kernel: alternatives: applying boot alternatives Dec 16 02:07:25.441457 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=756b815c2fd7ac2947efceb2a88878d1ea9723ec85037c2b4d1a09bd798bb749 Dec 16 02:07:25.441468 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Dec 16 02:07:25.441475 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 16 02:07:25.441482 kernel: Fallback order for Node 0: 0 Dec 16 02:07:25.441489 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000 Dec 16 02:07:25.441496 kernel: Policy zone: Normal Dec 16 02:07:25.441503 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Dec 16 02:07:25.441510 kernel: software IO TLB: area num 2. Dec 16 02:07:25.441517 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB) Dec 16 02:07:25.441526 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Dec 16 02:07:25.441533 kernel: rcu: Preemptible hierarchical RCU implementation. Dec 16 02:07:25.441541 kernel: rcu: RCU event tracing is enabled. Dec 16 02:07:25.441548 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Dec 16 02:07:25.441556 kernel: Trampoline variant of Tasks RCU enabled. Dec 16 02:07:25.441563 kernel: Tracing variant of Tasks RCU enabled. Dec 16 02:07:25.441570 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Dec 16 02:07:25.441577 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Dec 16 02:07:25.441584 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Dec 16 02:07:25.441591 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Dec 16 02:07:25.441598 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Dec 16 02:07:25.441607 kernel: GICv3: 256 SPIs implemented Dec 16 02:07:25.441614 kernel: GICv3: 0 Extended SPIs implemented Dec 16 02:07:25.441620 kernel: Root IRQ handler: gic_handle_irq Dec 16 02:07:25.441627 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Dec 16 02:07:25.441634 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Dec 16 02:07:25.441641 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Dec 16 02:07:25.441648 kernel: ITS [mem 0x08080000-0x0809ffff] Dec 16 02:07:25.441655 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1) Dec 16 02:07:25.441662 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1) Dec 16 02:07:25.441669 kernel: GICv3: using LPI property table @0x0000000100120000 Dec 16 02:07:25.441676 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000 Dec 16 02:07:25.441684 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Dec 16 02:07:25.441692 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Dec 16 02:07:25.441699 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Dec 16 02:07:25.441706 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Dec 16 02:07:25.441713 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Dec 16 02:07:25.441720 kernel: Console: colour dummy device 80x25 Dec 16 02:07:25.441727 kernel: ACPI: Core revision 20240827 Dec 16 02:07:25.441735 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Dec 16 02:07:25.441742 kernel: pid_max: default: 32768 minimum: 301 Dec 16 02:07:25.441751 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Dec 16 02:07:25.441758 kernel: landlock: Up and running. Dec 16 02:07:25.441766 kernel: SELinux: Initializing. Dec 16 02:07:25.441773 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Dec 16 02:07:25.441780 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Dec 16 02:07:25.441788 kernel: rcu: Hierarchical SRCU implementation. Dec 16 02:07:25.441795 kernel: rcu: Max phase no-delay instances is 400. Dec 16 02:07:25.441803 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Dec 16 02:07:25.441812 kernel: Remapping and enabling EFI services. Dec 16 02:07:25.441819 kernel: smp: Bringing up secondary CPUs ... Dec 16 02:07:25.441826 kernel: Detected PIPT I-cache on CPU1 Dec 16 02:07:25.441833 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Dec 16 02:07:25.441840 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000 Dec 16 02:07:25.441848 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Dec 16 02:07:25.441855 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Dec 16 02:07:25.441863 kernel: smp: Brought up 1 node, 2 CPUs Dec 16 02:07:25.441871 kernel: SMP: Total of 2 processors activated. 
Dec 16 02:07:25.441884 kernel: CPU: All CPU(s) started at EL1 Dec 16 02:07:25.441893 kernel: CPU features: detected: 32-bit EL0 Support Dec 16 02:07:25.441901 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Dec 16 02:07:25.441908 kernel: CPU features: detected: Common not Private translations Dec 16 02:07:25.441916 kernel: CPU features: detected: CRC32 instructions Dec 16 02:07:25.441923 kernel: CPU features: detected: Enhanced Virtualization Traps Dec 16 02:07:25.441932 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Dec 16 02:07:25.441940 kernel: CPU features: detected: LSE atomic instructions Dec 16 02:07:25.441948 kernel: CPU features: detected: Privileged Access Never Dec 16 02:07:25.441955 kernel: CPU features: detected: RAS Extension Support Dec 16 02:07:25.441963 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Dec 16 02:07:25.441972 kernel: alternatives: applying system-wide alternatives Dec 16 02:07:25.441980 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1 Dec 16 02:07:25.441988 kernel: Memory: 3885924K/4096000K available (11200K kernel code, 2456K rwdata, 9084K rodata, 12480K init, 1038K bss, 188596K reserved, 16384K cma-reserved) Dec 16 02:07:25.441995 kernel: devtmpfs: initialized Dec 16 02:07:25.442004 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Dec 16 02:07:25.442011 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Dec 16 02:07:25.442019 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Dec 16 02:07:25.442045 kernel: 0 pages in range for non-PLT usage Dec 16 02:07:25.442055 kernel: 515168 pages in range for PLT usage Dec 16 02:07:25.442063 kernel: pinctrl core: initialized pinctrl subsystem Dec 16 02:07:25.442070 kernel: SMBIOS 3.0.0 present. Dec 16 02:07:25.442078 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017 Dec 16 02:07:25.442086 kernel: DMI: Memory slots populated: 1/1 Dec 16 02:07:25.442093 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Dec 16 02:07:25.442101 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Dec 16 02:07:25.442111 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Dec 16 02:07:25.442118 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Dec 16 02:07:25.442126 kernel: audit: initializing netlink subsys (disabled) Dec 16 02:07:25.442134 kernel: audit: type=2000 audit(0.013:1): state=initialized audit_enabled=0 res=1 Dec 16 02:07:25.442141 kernel: thermal_sys: Registered thermal governor 'step_wise' Dec 16 02:07:25.442149 kernel: cpuidle: using governor menu Dec 16 02:07:25.442157 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Dec 16 02:07:25.442166 kernel: ASID allocator initialised with 32768 entries Dec 16 02:07:25.442174 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Dec 16 02:07:25.442182 kernel: Serial: AMBA PL011 UART driver Dec 16 02:07:25.442190 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Dec 16 02:07:25.442197 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Dec 16 02:07:25.442205 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Dec 16 02:07:25.442212 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Dec 16 02:07:25.442221 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Dec 16 02:07:25.442229 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Dec 16 02:07:25.442236 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Dec 16 02:07:25.442244 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Dec 16 02:07:25.442252 kernel: ACPI: Added _OSI(Module Device) Dec 16 02:07:25.442259 kernel: ACPI: Added _OSI(Processor Device) Dec 16 02:07:25.442267 kernel: ACPI: Added _OSI(Processor Aggregator Device) Dec 16 02:07:25.442275 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Dec 16 02:07:25.442284 kernel: ACPI: Interpreter enabled Dec 16 02:07:25.442292 kernel: ACPI: Using GIC for interrupt routing Dec 16 02:07:25.442300 kernel: ACPI: MCFG table detected, 1 entries Dec 16 02:07:25.442308 kernel: ACPI: CPU0 has been hot-added Dec 16 02:07:25.442315 kernel: ACPI: CPU1 has been hot-added Dec 16 02:07:25.442323 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Dec 16 02:07:25.442331 kernel: printk: legacy console [ttyAMA0] enabled Dec 16 02:07:25.442341 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Dec 16 02:07:25.442546 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Dec 16 02:07:25.442638 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Dec 16 02:07:25.442721 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Dec 16 02:07:25.442803 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Dec 16 02:07:25.442886 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Dec 16 02:07:25.442899 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Dec 16 02:07:25.442907 kernel: PCI host bridge to bus 0000:00 Dec 16 02:07:25.442997 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Dec 16 02:07:25.443094 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Dec 16 02:07:25.443171 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Dec 16 02:07:25.443246 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Dec 16 02:07:25.443354 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Dec 16 02:07:25.443489 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint Dec 16 02:07:25.443597 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff] Dec 16 02:07:25.443683 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref] Dec 16 02:07:25.443776 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 02:07:25.443863 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff] Dec 16 02:07:25.443945 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Dec 16 
02:07:25.444041 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Dec 16 02:07:25.444131 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Dec 16 02:07:25.444224 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 02:07:25.444309 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff] Dec 16 02:07:25.444402 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Dec 16 02:07:25.444508 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff] Dec 16 02:07:25.444606 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 02:07:25.444691 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff] Dec 16 02:07:25.444774 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Dec 16 02:07:25.444861 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff] Dec 16 02:07:25.444942 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Dec 16 02:07:25.445046 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 02:07:25.445133 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff] Dec 16 02:07:25.445214 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Dec 16 02:07:25.445297 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff] Dec 16 02:07:25.445385 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Dec 16 02:07:25.445485 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 02:07:25.445577 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff] Dec 16 02:07:25.445660 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Dec 16 02:07:25.445742 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Dec 16 02:07:25.445825 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Dec 16 02:07:25.445918 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 02:07:25.446000 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff] Dec 16 02:07:25.446094 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Dec 16 02:07:25.446178 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff] Dec 16 02:07:25.446417 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Dec 16 02:07:25.446568 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 02:07:25.446668 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff] Dec 16 02:07:25.447780 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Dec 16 02:07:25.447896 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff] Dec 16 02:07:25.448885 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref] Dec 16 02:07:25.449010 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 02:07:25.449121 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff] Dec 16 02:07:25.449215 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Dec 16 02:07:25.449297 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff] Dec 16 02:07:25.449388 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 02:07:25.449493 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff] Dec 16 02:07:25.449578 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Dec 16 02:07:25.449663 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff] Dec 16 02:07:25.449753 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 
0x070002 conventional PCI endpoint Dec 16 02:07:25.449835 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007] Dec 16 02:07:25.449929 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Dec 16 02:07:25.450015 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff] Dec 16 02:07:25.452201 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Dec 16 02:07:25.452294 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Dec 16 02:07:25.452394 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Dec 16 02:07:25.452526 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit] Dec 16 02:07:25.452629 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Dec 16 02:07:25.452714 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff] Dec 16 02:07:25.452806 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] Dec 16 02:07:25.452898 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Dec 16 02:07:25.453564 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] Dec 16 02:07:25.453680 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Dec 16 02:07:25.453766 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff] Dec 16 02:07:25.453856 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] Dec 16 02:07:25.453947 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Dec 16 02:07:25.454164 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff] Dec 16 02:07:25.454272 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] Dec 16 02:07:25.454370 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Dec 16 02:07:25.454511 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff] Dec 16 02:07:25.456143 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref] Dec 16 02:07:25.456237 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref] Dec 16 02:07:25.456324 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Dec 16 02:07:25.456406 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Dec 16 02:07:25.456506 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Dec 16 02:07:25.456596 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Dec 16 02:07:25.456686 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Dec 16 02:07:25.456768 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Dec 16 02:07:25.456854 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Dec 16 02:07:25.456935 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Dec 16 02:07:25.457020 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Dec 16 02:07:25.460286 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Dec 16 02:07:25.460380 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Dec 16 02:07:25.460515 kernel: pci 0000:00:02.3: bridge window [mem 
0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Dec 16 02:07:25.460616 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Dec 16 02:07:25.460715 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Dec 16 02:07:25.460815 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Dec 16 02:07:25.460925 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Dec 16 02:07:25.461017 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Dec 16 02:07:25.461181 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Dec 16 02:07:25.461277 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Dec 16 02:07:25.461366 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000 Dec 16 02:07:25.461463 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000 Dec 16 02:07:25.461564 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Dec 16 02:07:25.461649 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Dec 16 02:07:25.461737 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Dec 16 02:07:25.461826 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Dec 16 02:07:25.461909 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Dec 16 02:07:25.461995 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Dec 16 02:07:25.462124 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned Dec 16 02:07:25.462213 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Dec 16 02:07:25.462328 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned Dec 16 02:07:25.462413 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Dec 16 02:07:25.462510 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned Dec 16 02:07:25.462596 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Dec 16 02:07:25.462679 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned Dec 16 02:07:25.462760 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Dec 16 02:07:25.462844 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned Dec 16 02:07:25.462925 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Dec 16 02:07:25.463007 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Dec 16 02:07:25.463136 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Dec 16 02:07:25.463225 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Dec 16 02:07:25.463306 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Dec 16 
02:07:25.463389 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Dec 16 02:07:25.463688 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Dec 16 02:07:25.463794 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned Dec 16 02:07:25.463877 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Dec 16 02:07:25.466182 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned Dec 16 02:07:25.466300 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned Dec 16 02:07:25.466388 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned Dec 16 02:07:25.466492 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned Dec 16 02:07:25.466583 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned Dec 16 02:07:25.466674 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned Dec 16 02:07:25.466760 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned Dec 16 02:07:25.466842 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned Dec 16 02:07:25.466927 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned Dec 16 02:07:25.467009 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned Dec 16 02:07:25.467127 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned Dec 16 02:07:25.467212 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned Dec 16 02:07:25.467297 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned Dec 16 02:07:25.467378 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned Dec 16 02:07:25.467478 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned Dec 16 02:07:25.467565 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned Dec 16 02:07:25.467676 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned Dec 16 02:07:25.467788 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned Dec 16 02:07:25.467877 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned Dec 16 02:07:25.467959 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned Dec 16 02:07:25.468296 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned Dec 16 02:07:25.468424 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Dec 16 02:07:25.468567 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Dec 16 02:07:25.468664 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Dec 16 02:07:25.468748 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Dec 16 02:07:25.468830 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Dec 16 02:07:25.468913 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff] Dec 16 02:07:25.468999 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 02:07:25.470269 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Dec 16 02:07:25.472249 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Dec 16 02:07:25.472338 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Dec 16 02:07:25.472421 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff] Dec 16 02:07:25.472531 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 02:07:25.472627 kernel: pci 0000:03:00.0: BAR 4 [mem 
0x8000400000-0x8000403fff 64bit pref]: assigned Dec 16 02:07:25.472716 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Dec 16 02:07:25.472801 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Dec 16 02:07:25.472884 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Dec 16 02:07:25.472971 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff] Dec 16 02:07:25.473076 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 02:07:25.473171 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Dec 16 02:07:25.473262 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Dec 16 02:07:25.473344 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Dec 16 02:07:25.473426 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff] Dec 16 02:07:25.473551 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 02:07:25.473646 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Dec 16 02:07:25.473735 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Dec 16 02:07:25.473824 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Dec 16 02:07:25.473908 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Dec 16 02:07:25.473989 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Dec 16 02:07:25.474302 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 02:07:25.474403 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Dec 16 02:07:25.474514 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Dec 16 02:07:25.474609 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Dec 16 02:07:25.474705 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Dec 16 02:07:25.474787 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff] Dec 16 02:07:25.474868 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 02:07:25.474958 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned Dec 16 02:07:25.475058 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned Dec 16 02:07:25.475159 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned Dec 16 02:07:25.475246 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Dec 16 02:07:25.475331 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Dec 16 02:07:25.475417 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff] Dec 16 02:07:25.475520 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 02:07:25.478101 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Dec 16 02:07:25.478253 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Dec 16 02:07:25.478402 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff] Dec 16 02:07:25.478561 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 02:07:25.478665 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Dec 16 02:07:25.478749 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff] Dec 16 02:07:25.478836 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Dec 16 02:07:25.478918 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 02:07:25.479004 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Dec 16 02:07:25.479345 kernel: pci_bus 0000:00: 
resource 5 [io 0x0000-0xffff window] Dec 16 02:07:25.479469 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Dec 16 02:07:25.479574 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Dec 16 02:07:25.479654 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Dec 16 02:07:25.479733 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 02:07:25.479819 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Dec 16 02:07:25.479900 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Dec 16 02:07:25.479978 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 02:07:25.480082 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Dec 16 02:07:25.480162 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Dec 16 02:07:25.480238 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 02:07:25.480326 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Dec 16 02:07:25.480402 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Dec 16 02:07:25.480492 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 02:07:25.480577 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Dec 16 02:07:25.480653 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Dec 16 02:07:25.480734 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 02:07:25.480827 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Dec 16 02:07:25.480908 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Dec 16 02:07:25.480989 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 02:07:25.481134 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Dec 16 02:07:25.481215 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Dec 16 02:07:25.481291 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 02:07:25.481382 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Dec 16 02:07:25.481507 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Dec 16 02:07:25.481592 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 02:07:25.481678 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Dec 16 02:07:25.484199 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Dec 16 02:07:25.484307 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 02:07:25.484319 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Dec 16 02:07:25.484328 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Dec 16 02:07:25.484337 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Dec 16 02:07:25.484345 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Dec 16 02:07:25.484354 kernel: iommu: Default domain type: Translated Dec 16 02:07:25.484362 kernel: iommu: DMA domain TLB invalidation policy: strict mode Dec 16 02:07:25.484372 kernel: efivars: Registered efivars operations Dec 16 02:07:25.484391 kernel: vgaarb: loaded Dec 16 02:07:25.484400 kernel: clocksource: Switched to clocksource arch_sys_counter Dec 16 02:07:25.484408 kernel: VFS: Disk quotas dquot_6.6.0 Dec 16 02:07:25.484417 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 16 02:07:25.484425 kernel: pnp: PnP ACPI init Dec 16 02:07:25.484568 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Dec 16 
02:07:25.484586 kernel: pnp: PnP ACPI: found 1 devices Dec 16 02:07:25.484594 kernel: NET: Registered PF_INET protocol family Dec 16 02:07:25.484603 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Dec 16 02:07:25.484611 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Dec 16 02:07:25.484620 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 16 02:07:25.484629 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 16 02:07:25.484637 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Dec 16 02:07:25.484647 kernel: TCP: Hash tables configured (established 32768 bind 32768) Dec 16 02:07:25.484655 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Dec 16 02:07:25.484663 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Dec 16 02:07:25.484671 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 16 02:07:25.484767 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Dec 16 02:07:25.484780 kernel: PCI: CLS 0 bytes, default 64 Dec 16 02:07:25.484788 kernel: kvm [1]: HYP mode not available Dec 16 02:07:25.484798 kernel: Initialise system trusted keyrings Dec 16 02:07:25.484807 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Dec 16 02:07:25.484815 kernel: Key type asymmetric registered Dec 16 02:07:25.484823 kernel: Asymmetric key parser 'x509' registered Dec 16 02:07:25.484831 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Dec 16 02:07:25.484841 kernel: io scheduler mq-deadline registered Dec 16 02:07:25.484849 kernel: io scheduler kyber registered Dec 16 02:07:25.484859 kernel: io scheduler bfq registered Dec 16 02:07:25.484868 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Dec 16 02:07:25.484954 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Dec 16 02:07:25.485125 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Dec 16 02:07:25.485218 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 02:07:25.485305 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Dec 16 02:07:25.485407 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Dec 16 02:07:25.485512 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 02:07:25.485602 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Dec 16 02:07:25.485684 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Dec 16 02:07:25.485766 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 02:07:25.485852 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Dec 16 02:07:25.485935 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Dec 16 02:07:25.486020 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 02:07:25.486124 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Dec 16 02:07:25.486209 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Dec 16 02:07:25.486290 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 02:07:25.486375 kernel: pcieport 
0000:00:02.5: PME: Signaling with IRQ 55 Dec 16 02:07:25.486499 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Dec 16 02:07:25.486596 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 02:07:25.486683 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Dec 16 02:07:25.486767 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Dec 16 02:07:25.486849 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 02:07:25.486936 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Dec 16 02:07:25.487018 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Dec 16 02:07:25.487129 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 02:07:25.487146 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 16 02:07:25.487235 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Dec 16 02:07:25.487318 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Dec 16 02:07:25.487399 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 02:07:25.487411 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 16 02:07:25.487420 kernel: ACPI: button: Power Button [PWRB] Dec 16 02:07:25.487430 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 16 02:07:25.487532 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 16 02:07:25.487623 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Dec 16 02:07:25.487635 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 02:07:25.487644 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 16 02:07:25.487729 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Dec 16 02:07:25.487740 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Dec 16 02:07:25.487751 kernel: thunder_xcv, ver 1.0 Dec 16 02:07:25.487760 kernel: thunder_bgx, ver 1.0 Dec 16 02:07:25.487768 kernel: nicpf, ver 1.0 Dec 16 02:07:25.487776 kernel: nicvf, ver 1.0 Dec 16 02:07:25.487873 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 16 02:07:25.487953 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-16T02:07:24 UTC (1765850844) Dec 16 02:07:25.487964 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 16 02:07:25.487975 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Dec 16 02:07:25.487983 kernel: watchdog: NMI not fully supported Dec 16 02:07:25.487992 kernel: watchdog: Hard watchdog permanently disabled Dec 16 02:07:25.488000 kernel: NET: Registered PF_INET6 protocol family Dec 16 02:07:25.488008 kernel: Segment Routing with IPv6 Dec 16 02:07:25.488016 kernel: In-situ OAM (IOAM) with IPv6 Dec 16 02:07:25.488035 kernel: NET: Registered PF_PACKET protocol family Dec 16 02:07:25.488046 kernel: Key type dns_resolver registered Dec 16 02:07:25.488055 kernel: registered taskstats version 1 Dec 16 02:07:25.488063 kernel: Loading compiled-in X.509 certificates Dec 16 02:07:25.488071 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 545838337a91b65b763486e536766b3eec3ef99d' Dec 16 02:07:25.488079 kernel: Demotion targets for Node 0: null Dec 16 02:07:25.488088 kernel: Key type .fscrypt 
registered Dec 16 02:07:25.488096 kernel: Key type fscrypt-provisioning registered Dec 16 02:07:25.488105 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 16 02:07:25.488113 kernel: ima: Allocated hash algorithm: sha1 Dec 16 02:07:25.488121 kernel: ima: No architecture policies found Dec 16 02:07:25.488130 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Dec 16 02:07:25.488138 kernel: clk: Disabling unused clocks Dec 16 02:07:25.488147 kernel: PM: genpd: Disabling unused power domains Dec 16 02:07:25.488155 kernel: Freeing unused kernel memory: 12480K Dec 16 02:07:25.488165 kernel: Run /init as init process Dec 16 02:07:25.488173 kernel: with arguments: Dec 16 02:07:25.488182 kernel: /init Dec 16 02:07:25.488190 kernel: with environment: Dec 16 02:07:25.488197 kernel: HOME=/ Dec 16 02:07:25.488205 kernel: TERM=linux Dec 16 02:07:25.488213 kernel: ACPI: bus type USB registered Dec 16 02:07:25.488221 kernel: usbcore: registered new interface driver usbfs Dec 16 02:07:25.488231 kernel: usbcore: registered new interface driver hub Dec 16 02:07:25.488239 kernel: usbcore: registered new device driver usb Dec 16 02:07:25.488332 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 02:07:25.488417 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Dec 16 02:07:25.488540 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Dec 16 02:07:25.488629 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 02:07:25.488716 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Dec 16 02:07:25.488800 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Dec 16 02:07:25.488916 kernel: hub 1-0:1.0: USB hub found Dec 16 02:07:25.489006 kernel: hub 1-0:1.0: 4 ports detected Dec 16 02:07:25.489131 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Dec 16 02:07:25.489239 kernel: hub 2-0:1.0: USB hub found Dec 16 02:07:25.489330 kernel: hub 2-0:1.0: 4 ports detected Dec 16 02:07:25.489341 kernel: SCSI subsystem initialized Dec 16 02:07:25.489438 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Dec 16 02:07:25.489562 kernel: scsi host0: Virtio SCSI HBA Dec 16 02:07:25.489756 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Dec 16 02:07:25.489881 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Dec 16 02:07:25.489976 kernel: sr 0:0:0:0: Power-on or device reset occurred Dec 16 02:07:25.491678 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Dec 16 02:07:25.491702 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Dec 16 02:07:25.491800 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Dec 16 02:07:25.491904 kernel: sd 0:0:0:1: Power-on or device reset occurred Dec 16 02:07:25.491997 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Dec 16 02:07:25.492116 kernel: sd 0:0:0:1: [sda] Write Protect is off Dec 16 02:07:25.492207 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Dec 16 02:07:25.492296 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Dec 16 02:07:25.492307 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 16 02:07:25.492318 kernel: GPT:25804799 != 80003071 Dec 16 02:07:25.492327 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 16 02:07:25.492335 kernel: GPT:25804799 != 80003071 Dec 16 02:07:25.492343 kernel: GPT: Use GNU Parted to correct GPT errors. 
Dec 16 02:07:25.492351 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 16 02:07:25.492439 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Dec 16 02:07:25.492460 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 16 02:07:25.492472 kernel: device-mapper: uevent: version 1.0.3 Dec 16 02:07:25.492480 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 16 02:07:25.492488 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Dec 16 02:07:25.492497 kernel: raid6: neonx8 gen() 15249 MB/s Dec 16 02:07:25.492505 kernel: raid6: neonx4 gen() 13536 MB/s Dec 16 02:07:25.492513 kernel: raid6: neonx2 gen() 13119 MB/s Dec 16 02:07:25.492521 kernel: raid6: neonx1 gen() 10303 MB/s Dec 16 02:07:25.492530 kernel: raid6: int64x8 gen() 6717 MB/s Dec 16 02:07:25.492539 kernel: raid6: int64x4 gen() 7265 MB/s Dec 16 02:07:25.492588 kernel: raid6: int64x2 gen() 6011 MB/s Dec 16 02:07:25.492600 kernel: raid6: int64x1 gen() 4747 MB/s Dec 16 02:07:25.492739 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Dec 16 02:07:25.492753 kernel: raid6: using algorithm neonx8 gen() 15249 MB/s Dec 16 02:07:25.492762 kernel: raid6: .... xor() 11923 MB/s, rmw enabled Dec 16 02:07:25.492774 kernel: raid6: using neon recovery algorithm Dec 16 02:07:25.492782 kernel: xor: measuring software checksum speed Dec 16 02:07:25.492791 kernel: 8regs : 21641 MB/sec Dec 16 02:07:25.492799 kernel: 32regs : 21704 MB/sec Dec 16 02:07:25.492807 kernel: arm64_neon : 26691 MB/sec Dec 16 02:07:25.492815 kernel: xor: using function: arm64_neon (26691 MB/sec) Dec 16 02:07:25.492823 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 16 02:07:25.492833 kernel: BTRFS: device fsid d00a2bc5-1c68-4957-aa37-d070193fcf05 devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (212) Dec 16 02:07:25.492841 kernel: BTRFS info (device dm-0): first mount of filesystem d00a2bc5-1c68-4957-aa37-d070193fcf05 Dec 16 02:07:25.492850 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Dec 16 02:07:25.492858 kernel: BTRFS info (device dm-0): enabling ssd optimizations Dec 16 02:07:25.492866 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 16 02:07:25.492875 kernel: BTRFS info (device dm-0): enabling free space tree Dec 16 02:07:25.492883 kernel: loop: module loaded Dec 16 02:07:25.492892 kernel: loop0: detected capacity change from 0 to 91832 Dec 16 02:07:25.492901 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 16 02:07:25.493005 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Dec 16 02:07:25.493018 systemd[1]: Successfully made /usr/ read-only. Dec 16 02:07:25.493050 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 02:07:25.493063 systemd[1]: Detected virtualization kvm. Dec 16 02:07:25.493071 systemd[1]: Detected architecture arm64. Dec 16 02:07:25.493079 systemd[1]: Running in initrd. Dec 16 02:07:25.493088 systemd[1]: No hostname configured, using default hostname. Dec 16 02:07:25.493097 systemd[1]: Hostname set to . Dec 16 02:07:25.493105 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. 
Dec 16 02:07:25.493113 systemd[1]: Queued start job for default target initrd.target. Dec 16 02:07:25.493124 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Dec 16 02:07:25.493133 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 02:07:25.493141 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 02:07:25.493151 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 16 02:07:25.493159 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 02:07:25.493169 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 16 02:07:25.493179 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 16 02:07:25.493188 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 02:07:25.493197 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 02:07:25.493205 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 16 02:07:25.493214 systemd[1]: Reached target paths.target - Path Units. Dec 16 02:07:25.493223 systemd[1]: Reached target slices.target - Slice Units. Dec 16 02:07:25.493233 systemd[1]: Reached target swap.target - Swaps. Dec 16 02:07:25.493242 systemd[1]: Reached target timers.target - Timer Units. Dec 16 02:07:25.493250 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 02:07:25.493259 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 02:07:25.493268 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Dec 16 02:07:25.493277 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 16 02:07:25.493286 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 16 02:07:25.493296 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 02:07:25.493305 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 02:07:25.493315 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 02:07:25.493323 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 02:07:25.493332 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 16 02:07:25.493341 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 16 02:07:25.493350 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 02:07:25.493360 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 16 02:07:25.493369 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 16 02:07:25.493378 systemd[1]: Starting systemd-fsck-usr.service... Dec 16 02:07:25.493387 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 02:07:25.493396 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 02:07:25.493407 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 02:07:25.493416 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. 
Dec 16 02:07:25.493424 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 02:07:25.493434 systemd[1]: Finished systemd-fsck-usr.service. Dec 16 02:07:25.493479 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 02:07:25.493522 systemd-journald[350]: Collecting audit messages is enabled. Dec 16 02:07:25.493545 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 16 02:07:25.493554 kernel: Bridge firewalling registered Dec 16 02:07:25.493565 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 02:07:25.493574 kernel: audit: type=1130 audit(1765850845.470:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.493583 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 02:07:25.493592 kernel: audit: type=1130 audit(1765850845.473:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.493601 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 16 02:07:25.493610 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 02:07:25.493620 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 02:07:25.493630 kernel: audit: type=1130 audit(1765850845.488:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.493640 systemd-journald[350]: Journal started Dec 16 02:07:25.493659 systemd-journald[350]: Runtime Journal (/run/log/journal/a108b4a0f6d04f76852245c450befaaf) is 8M, max 76.5M, 68.5M free. Dec 16 02:07:25.470000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.473000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.488000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.468138 systemd-modules-load[352]: Inserted module 'br_netfilter' Dec 16 02:07:25.498050 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 02:07:25.498087 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 02:07:25.498100 kernel: audit: type=1130 audit(1765850845.497:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.497000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:07:25.502896 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 02:07:25.515813 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 02:07:25.516000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.521062 kernel: audit: type=1130 audit(1765850845.516:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.521195 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 02:07:25.522000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.525261 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 16 02:07:25.527000 audit: BPF prog-id=6 op=LOAD Dec 16 02:07:25.530071 kernel: audit: type=1130 audit(1765850845.522:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.531082 kernel: audit: type=1334 audit(1765850845.527:8): prog-id=6 op=LOAD Dec 16 02:07:25.530350 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 02:07:25.534068 systemd-tmpfiles[371]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 16 02:07:25.537129 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 02:07:25.537000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.541070 kernel: audit: type=1130 audit(1765850845.537:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.543717 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 02:07:25.544000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.549048 kernel: audit: type=1130 audit(1765850845.544:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.559176 dracut-cmdline[384]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=756b815c2fd7ac2947efceb2a88878d1ea9723ec85037c2b4d1a09bd798bb749 Dec 16 02:07:25.591217 systemd-resolved[385]: Positive Trust Anchors: Dec 16 02:07:25.591235 systemd-resolved[385]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 02:07:25.591239 systemd-resolved[385]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Dec 16 02:07:25.591270 systemd-resolved[385]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 02:07:25.623885 systemd-resolved[385]: Defaulting to hostname 'linux'. Dec 16 02:07:25.626853 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 02:07:25.628000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.628813 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 02:07:25.686117 kernel: Loading iSCSI transport class v2.0-870. Dec 16 02:07:25.699071 kernel: iscsi: registered transport (tcp) Dec 16 02:07:25.713080 kernel: iscsi: registered transport (qla4xxx) Dec 16 02:07:25.713154 kernel: QLogic iSCSI HBA Driver Dec 16 02:07:25.740804 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 02:07:25.763201 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 02:07:25.764000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.766322 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 02:07:25.827871 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 16 02:07:25.828000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.830740 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 16 02:07:25.832135 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 16 02:07:25.892117 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 16 02:07:25.892000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:25.893000 audit: BPF prog-id=7 op=LOAD Dec 16 02:07:25.893000 audit: BPF prog-id=8 op=LOAD Dec 16 02:07:25.894917 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 02:07:25.933399 systemd-udevd[628]: Using default interface naming scheme 'v257'. Dec 16 02:07:25.944000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:07:25.943584 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 02:07:25.949737 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 16 02:07:25.985901 dracut-pre-trigger[684]: rd.md=0: removing MD RAID activation Dec 16 02:07:26.003000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:26.002782 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 02:07:26.007000 audit: BPF prog-id=9 op=LOAD Dec 16 02:07:26.009427 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 02:07:26.042118 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 02:07:26.043000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:26.045404 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 02:07:26.063706 systemd-networkd[746]: lo: Link UP Dec 16 02:07:26.063714 systemd-networkd[746]: lo: Gained carrier Dec 16 02:07:26.067000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:26.067178 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 02:07:26.067954 systemd[1]: Reached target network.target - Network. Dec 16 02:07:26.129877 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 02:07:26.132000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:26.135575 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 16 02:07:26.252881 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Dec 16 02:07:26.266683 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Dec 16 02:07:26.288406 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Dec 16 02:07:26.299694 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 16 02:07:26.302525 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 16 02:07:26.327798 disk-uuid[804]: Primary Header is updated. Dec 16 02:07:26.327798 disk-uuid[804]: Secondary Entries is updated. Dec 16 02:07:26.327798 disk-uuid[804]: Secondary Header is updated. Dec 16 02:07:26.355384 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Dec 16 02:07:26.355469 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Dec 16 02:07:26.370052 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Dec 16 02:07:26.386916 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Dec 16 02:07:26.387920 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 02:07:26.389000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:26.388597 systemd-networkd[746]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 02:07:26.388601 systemd-networkd[746]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 02:07:26.389015 systemd-networkd[746]: eth0: Link UP Dec 16 02:07:26.390128 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 02:07:26.390194 systemd-networkd[746]: eth0: Gained carrier Dec 16 02:07:26.390207 systemd-networkd[746]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 02:07:26.395090 systemd-networkd[746]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 02:07:26.395094 systemd-networkd[746]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 02:07:26.396084 systemd-networkd[746]: eth1: Link UP Dec 16 02:07:26.396873 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 02:07:26.396907 systemd-networkd[746]: eth1: Gained carrier Dec 16 02:07:26.396922 systemd-networkd[746]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 02:07:26.439445 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Dec 16 02:07:26.439692 kernel: usbcore: registered new interface driver usbhid Dec 16 02:07:26.440233 systemd-networkd[746]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Dec 16 02:07:26.445134 systemd-networkd[746]: eth0: DHCPv4 address 49.13.61.135/32, gateway 172.31.1.1 acquired from 172.31.1.1 Dec 16 02:07:26.448260 kernel: usbhid: USB HID core driver Dec 16 02:07:26.453000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:26.453159 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 02:07:26.532107 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 16 02:07:26.533153 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 02:07:26.532000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:26.534546 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 02:07:26.536561 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 02:07:26.538872 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 16 02:07:26.570816 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 16 02:07:26.571000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Dec 16 02:07:27.371116 disk-uuid[805]: Warning: The kernel is still using the old partition table. Dec 16 02:07:27.371116 disk-uuid[805]: The new table will be used at the next reboot or after you Dec 16 02:07:27.371116 disk-uuid[805]: run partprobe(8) or kpartx(8) Dec 16 02:07:27.371116 disk-uuid[805]: The operation has completed successfully. Dec 16 02:07:27.380952 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 16 02:07:27.381138 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 16 02:07:27.382000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:27.382000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:27.384960 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 16 02:07:27.424079 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (841) Dec 16 02:07:27.426090 kernel: BTRFS info (device sda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 02:07:27.426163 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 02:07:27.430150 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 02:07:27.430222 kernel: BTRFS info (device sda6): turning on async discard Dec 16 02:07:27.430258 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 02:07:27.438127 kernel: BTRFS info (device sda6): last unmount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 02:07:27.438690 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 16 02:07:27.439000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:27.440599 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 16 02:07:27.555109 systemd-networkd[746]: eth0: Gained IPv6LL Dec 16 02:07:27.591673 ignition[860]: Ignition 2.24.0 Dec 16 02:07:27.591691 ignition[860]: Stage: fetch-offline Dec 16 02:07:27.594902 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 02:07:27.595000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:27.591737 ignition[860]: no configs at "/usr/lib/ignition/base.d" Dec 16 02:07:27.596815 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Dec 16 02:07:27.591746 ignition[860]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 02:07:27.591899 ignition[860]: parsed url from cmdline: "" Dec 16 02:07:27.591902 ignition[860]: no config URL provided Dec 16 02:07:27.591906 ignition[860]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 02:07:27.591915 ignition[860]: no config at "/usr/lib/ignition/user.ign" Dec 16 02:07:27.591920 ignition[860]: failed to fetch config: resource requires networking Dec 16 02:07:27.592204 ignition[860]: Ignition finished successfully Dec 16 02:07:27.624599 ignition[868]: Ignition 2.24.0 Dec 16 02:07:27.624614 ignition[868]: Stage: fetch Dec 16 02:07:27.624765 ignition[868]: no configs at "/usr/lib/ignition/base.d" Dec 16 02:07:27.624773 ignition[868]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 02:07:27.624859 ignition[868]: parsed url from cmdline: "" Dec 16 02:07:27.624862 ignition[868]: no config URL provided Dec 16 02:07:27.624869 ignition[868]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 02:07:27.624875 ignition[868]: no config at "/usr/lib/ignition/user.ign" Dec 16 02:07:27.624904 ignition[868]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Dec 16 02:07:27.631279 ignition[868]: GET result: OK Dec 16 02:07:27.631367 ignition[868]: parsing config with SHA512: ecb1af768193099dccff7d520a7be415314d150c818f9042726ba64ce875edd37691261b46122ad7f1374f1ce021d693ce50a0001c19acdade0374110f5f4429 Dec 16 02:07:27.639829 unknown[868]: fetched base config from "system" Dec 16 02:07:27.639842 unknown[868]: fetched base config from "system" Dec 16 02:07:27.641001 ignition[868]: fetch: fetch complete Dec 16 02:07:27.639851 unknown[868]: fetched user config from "hetzner" Dec 16 02:07:27.641007 ignition[868]: fetch: fetch passed Dec 16 02:07:27.641085 ignition[868]: Ignition finished successfully Dec 16 02:07:27.643348 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 16 02:07:27.644000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:27.647188 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 16 02:07:27.671305 ignition[874]: Ignition 2.24.0 Dec 16 02:07:27.671969 ignition[874]: Stage: kargs Dec 16 02:07:27.672506 ignition[874]: no configs at "/usr/lib/ignition/base.d" Dec 16 02:07:27.672518 ignition[874]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 02:07:27.673348 ignition[874]: kargs: kargs passed Dec 16 02:07:27.673397 ignition[874]: Ignition finished successfully Dec 16 02:07:27.677666 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 16 02:07:27.679000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:27.681194 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Dec 16 02:07:27.706831 ignition[881]: Ignition 2.24.0 Dec 16 02:07:27.706844 ignition[881]: Stage: disks Dec 16 02:07:27.707009 ignition[881]: no configs at "/usr/lib/ignition/base.d" Dec 16 02:07:27.707017 ignition[881]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 02:07:27.708107 ignition[881]: disks: disks passed Dec 16 02:07:27.708161 ignition[881]: Ignition finished successfully Dec 16 02:07:27.710000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:27.710139 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 16 02:07:27.711254 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 16 02:07:27.712157 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 16 02:07:27.713363 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 02:07:27.714628 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 02:07:27.715208 systemd[1]: Reached target basic.target - Basic System. Dec 16 02:07:27.717296 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 16 02:07:27.772626 systemd-fsck[890]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Dec 16 02:07:27.776290 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 16 02:07:27.778000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:27.780169 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 16 02:07:27.866052 kernel: EXT4-fs (sda9): mounted filesystem 0e69f709-36a9-4e15-b0c9-c7e150185653 r/w with ordered data mode. Quota mode: none. Dec 16 02:07:27.866774 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 16 02:07:27.868549 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 16 02:07:27.872396 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 02:07:27.874865 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 16 02:07:27.881197 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Dec 16 02:07:27.881858 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 16 02:07:27.881898 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 02:07:27.891099 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 16 02:07:27.892591 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 16 02:07:27.903562 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (898) Dec 16 02:07:27.907267 kernel: BTRFS info (device sda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 02:07:27.907322 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 02:07:27.918546 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 02:07:27.918606 kernel: BTRFS info (device sda6): turning on async discard Dec 16 02:07:27.919520 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 02:07:27.926107 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Dec 16 02:07:27.963898 coreos-metadata[900]: Dec 16 02:07:27.963 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Dec 16 02:07:27.966791 coreos-metadata[900]: Dec 16 02:07:27.966 INFO Fetch successful Dec 16 02:07:27.966791 coreos-metadata[900]: Dec 16 02:07:27.966 INFO wrote hostname ci-4547-0-0-9-be0981937a to /sysroot/etc/hostname Dec 16 02:07:27.972159 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 16 02:07:27.972000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:28.073435 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 16 02:07:28.074000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:28.077110 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 16 02:07:28.079992 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 16 02:07:28.094611 systemd[1]: sysroot-oem.mount: Deactivated successfully. Dec 16 02:07:28.097048 kernel: BTRFS info (device sda6): last unmount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 02:07:28.119000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:28.119068 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 16 02:07:28.127186 systemd-networkd[746]: eth1: Gained IPv6LL Dec 16 02:07:28.131123 ignition[999]: INFO : Ignition 2.24.0 Dec 16 02:07:28.132955 ignition[999]: INFO : Stage: mount Dec 16 02:07:28.132955 ignition[999]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 02:07:28.132955 ignition[999]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 02:07:28.132955 ignition[999]: INFO : mount: mount passed Dec 16 02:07:28.132955 ignition[999]: INFO : Ignition finished successfully Dec 16 02:07:28.135109 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 16 02:07:28.136000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:28.139201 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 16 02:07:28.870506 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 02:07:28.904078 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1010) Dec 16 02:07:28.905722 kernel: BTRFS info (device sda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 02:07:28.905793 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 02:07:28.910160 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 02:07:28.910228 kernel: BTRFS info (device sda6): turning on async discard Dec 16 02:07:28.910244 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 02:07:28.913339 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Dec 16 02:07:28.943548 ignition[1028]: INFO : Ignition 2.24.0 Dec 16 02:07:28.945360 ignition[1028]: INFO : Stage: files Dec 16 02:07:28.945360 ignition[1028]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 02:07:28.945360 ignition[1028]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 02:07:28.945360 ignition[1028]: DEBUG : files: compiled without relabeling support, skipping Dec 16 02:07:28.949712 ignition[1028]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 16 02:07:28.949712 ignition[1028]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 16 02:07:28.953930 ignition[1028]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 16 02:07:28.953930 ignition[1028]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 16 02:07:28.953930 ignition[1028]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 16 02:07:28.952116 unknown[1028]: wrote ssh authorized keys file for user: core Dec 16 02:07:28.958629 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 16 02:07:28.958629 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Dec 16 02:07:29.024712 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 16 02:07:29.097623 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 16 02:07:29.098933 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 16 02:07:29.100335 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 16 02:07:29.100335 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 16 02:07:29.100335 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 16 02:07:29.100335 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 02:07:29.100335 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 02:07:29.100335 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 02:07:29.100335 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 02:07:29.109468 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 02:07:29.109468 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 02:07:29.109468 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 02:07:29.109468 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 02:07:29.109468 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 02:07:29.109468 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-arm64.raw: attempt #1 Dec 16 02:07:29.432409 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 16 02:07:29.986329 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 02:07:29.989471 ignition[1028]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 16 02:07:29.989471 ignition[1028]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 02:07:29.995555 ignition[1028]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 02:07:29.995555 ignition[1028]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 16 02:07:29.995555 ignition[1028]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Dec 16 02:07:29.995555 ignition[1028]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Dec 16 02:07:29.995555 ignition[1028]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Dec 16 02:07:29.995555 ignition[1028]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Dec 16 02:07:29.995555 ignition[1028]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Dec 16 02:07:29.995555 ignition[1028]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Dec 16 02:07:29.995555 ignition[1028]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 16 02:07:29.995555 ignition[1028]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 16 02:07:29.995555 ignition[1028]: INFO : files: files passed Dec 16 02:07:29.995555 ignition[1028]: INFO : Ignition finished successfully Dec 16 02:07:30.011013 kernel: kauditd_printk_skb: 28 callbacks suppressed Dec 16 02:07:30.011053 kernel: audit: type=1130 audit(1765850849.998:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:29.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:29.995339 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 16 02:07:30.000736 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 16 02:07:30.005276 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Dec 16 02:07:30.026754 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 16 02:07:30.033879 kernel: audit: type=1130 audit(1765850850.028:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.033912 kernel: audit: type=1131 audit(1765850850.031:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.028000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.031000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.026864 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 16 02:07:30.039009 initrd-setup-root-after-ignition[1058]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 02:07:30.039009 initrd-setup-root-after-ignition[1058]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 16 02:07:30.041609 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 02:07:30.046145 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 02:07:30.047000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.047374 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 16 02:07:30.052678 kernel: audit: type=1130 audit(1765850850.047:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.053211 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 16 02:07:30.135016 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 16 02:07:30.135212 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Dec 16 02:07:30.136000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.139764 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 16 02:07:30.143508 kernel: audit: type=1130 audit(1765850850.136:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.143537 kernel: audit: type=1131 audit(1765850850.136:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.136000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:07:30.140636 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 16 02:07:30.143132 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 16 02:07:30.144164 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 16 02:07:30.167890 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 02:07:30.168000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.174040 kernel: audit: type=1130 audit(1765850850.168:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.173276 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 16 02:07:30.199975 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Dec 16 02:07:30.201042 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 16 02:07:30.201775 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 02:07:30.203286 systemd[1]: Stopped target timers.target - Timer Units. Dec 16 02:07:30.204553 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 16 02:07:30.204808 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 02:07:30.206000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.207640 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 16 02:07:30.210501 kernel: audit: type=1131 audit(1765850850.206:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.210727 systemd[1]: Stopped target basic.target - Basic System. Dec 16 02:07:30.212355 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 16 02:07:30.214666 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 02:07:30.216263 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 16 02:07:30.217768 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 16 02:07:30.219220 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 16 02:07:30.220613 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 02:07:30.221793 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 16 02:07:30.222963 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 16 02:07:30.223988 systemd[1]: Stopped target swap.target - Swaps. Dec 16 02:07:30.224975 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 16 02:07:30.225133 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 16 02:07:30.225000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:07:30.226510 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 16 02:07:30.230210 kernel: audit: type=1131 audit(1765850850.225:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.228494 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 02:07:30.229675 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 16 02:07:30.229759 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 02:07:30.235111 kernel: audit: type=1131 audit(1765850850.232:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.232000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.230847 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 16 02:07:30.235000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.230969 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 16 02:07:30.236000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.234318 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 16 02:07:30.238000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.234472 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 02:07:30.235820 systemd[1]: ignition-files.service: Deactivated successfully. Dec 16 02:07:30.235911 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 16 02:07:30.236833 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Dec 16 02:07:30.236936 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 16 02:07:30.240084 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 16 02:07:30.244280 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 16 02:07:30.244821 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 16 02:07:30.246000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.247000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.246183 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 02:07:30.246987 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Dec 16 02:07:30.250000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.247113 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 02:07:30.247863 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 16 02:07:30.247960 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 02:07:30.256863 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 16 02:07:30.259000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.259000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.259283 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 16 02:07:30.272448 ignition[1082]: INFO : Ignition 2.24.0 Dec 16 02:07:30.272448 ignition[1082]: INFO : Stage: umount Dec 16 02:07:30.272448 ignition[1082]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 02:07:30.272448 ignition[1082]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 02:07:30.272448 ignition[1082]: INFO : umount: umount passed Dec 16 02:07:30.272448 ignition[1082]: INFO : Ignition finished successfully Dec 16 02:07:30.281000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.283000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.272068 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 16 02:07:30.279539 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 16 02:07:30.292000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.279722 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 16 02:07:30.295000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.282081 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 16 02:07:30.282193 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 16 02:07:30.303000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.283872 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 16 02:07:30.283929 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 16 02:07:30.292542 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 16 02:07:30.292652 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 16 02:07:30.296134 systemd[1]: Stopped target network.target - Network. 
Dec 16 02:07:30.299038 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 16 02:07:30.299138 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 02:07:30.306122 systemd[1]: Stopped target paths.target - Path Units. Dec 16 02:07:30.307270 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 16 02:07:30.309379 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 02:07:30.310736 systemd[1]: Stopped target slices.target - Slice Units. Dec 16 02:07:30.311773 systemd[1]: Stopped target sockets.target - Socket Units. Dec 16 02:07:30.312665 systemd[1]: iscsid.socket: Deactivated successfully. Dec 16 02:07:30.316000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.312709 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 02:07:30.317000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.313619 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 16 02:07:30.313653 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 02:07:30.314814 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Dec 16 02:07:30.314838 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Dec 16 02:07:30.315725 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 16 02:07:30.315780 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 16 02:07:30.316621 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 16 02:07:30.316660 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 16 02:07:30.324000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.317825 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 16 02:07:30.327000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.318807 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 16 02:07:30.323627 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 16 02:07:30.323752 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 16 02:07:30.325466 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 16 02:07:30.332000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.325578 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 16 02:07:30.332000 audit: BPF prog-id=6 op=UNLOAD Dec 16 02:07:30.330082 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 16 02:07:30.330200 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. 
Dec 16 02:07:30.335000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.334150 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 16 02:07:30.334245 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 16 02:07:30.338000 audit: BPF prog-id=9 op=UNLOAD Dec 16 02:07:30.338732 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 16 02:07:30.339880 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 16 02:07:30.339939 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 16 02:07:30.342161 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 16 02:07:30.344118 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 16 02:07:30.344198 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 02:07:30.346000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.346000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.346247 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 16 02:07:30.346299 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 16 02:07:30.346892 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 16 02:07:30.350000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.346928 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 16 02:07:30.350895 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 02:07:30.365362 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 16 02:07:30.367068 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 02:07:30.367000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.369748 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 16 02:07:30.369838 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 16 02:07:30.372208 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 16 02:07:30.372264 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 02:07:30.375000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.374158 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 16 02:07:30.374216 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 16 02:07:30.377000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:07:30.376481 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 16 02:07:30.379000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.376539 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 16 02:07:30.378415 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 16 02:07:30.378466 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 02:07:30.383015 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 16 02:07:30.384504 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 16 02:07:30.385289 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 02:07:30.387000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.387180 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 16 02:07:30.387864 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 02:07:30.389000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.389794 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Dec 16 02:07:30.391000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.390541 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 02:07:30.391389 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 16 02:07:30.392000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.393000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.391453 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 02:07:30.396000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.392263 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 02:07:30.392313 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 02:07:30.394557 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 16 02:07:30.394703 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 16 02:07:30.404954 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 16 02:07:30.405181 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. 
Dec 16 02:07:30.406000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.406000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:30.407470 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 16 02:07:30.410522 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 16 02:07:30.435021 systemd[1]: Switching root. Dec 16 02:07:30.480309 systemd-journald[350]: Journal stopped Dec 16 02:07:31.427346 systemd-journald[350]: Received SIGTERM from PID 1 (systemd). Dec 16 02:07:31.427420 kernel: SELinux: policy capability network_peer_controls=1 Dec 16 02:07:31.427436 kernel: SELinux: policy capability open_perms=1 Dec 16 02:07:31.427447 kernel: SELinux: policy capability extended_socket_class=1 Dec 16 02:07:31.427462 kernel: SELinux: policy capability always_check_network=0 Dec 16 02:07:31.427476 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 16 02:07:31.427486 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 16 02:07:31.427499 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 16 02:07:31.427511 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 16 02:07:31.427524 kernel: SELinux: policy capability userspace_initial_context=0 Dec 16 02:07:31.427534 systemd[1]: Successfully loaded SELinux policy in 53.156ms. Dec 16 02:07:31.427553 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.416ms. Dec 16 02:07:31.427568 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 02:07:31.427580 systemd[1]: Detected virtualization kvm. Dec 16 02:07:31.427592 systemd[1]: Detected architecture arm64. Dec 16 02:07:31.427603 systemd[1]: Detected first boot. Dec 16 02:07:31.427613 systemd[1]: Hostname set to <ci-4547-0-0-9-be0981937a>. Dec 16 02:07:31.427625 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Dec 16 02:07:31.427636 zram_generator::config[1125]: No configuration found. Dec 16 02:07:31.427650 kernel: NET: Registered PF_VSOCK protocol family Dec 16 02:07:31.427664 systemd[1]: Populated /etc with preset unit settings. Dec 16 02:07:31.427675 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 16 02:07:31.427685 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 16 02:07:31.427696 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 16 02:07:31.427709 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 16 02:07:31.427721 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 16 02:07:31.427734 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 16 02:07:31.427745 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 16 02:07:31.427756 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 16 02:07:31.427767 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
Dec 16 02:07:31.427777 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 16 02:07:31.427792 systemd[1]: Created slice user.slice - User and Session Slice. Dec 16 02:07:31.427803 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 02:07:31.427814 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 02:07:31.427825 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 16 02:07:31.427836 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 16 02:07:31.427847 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 16 02:07:31.427858 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 02:07:31.428819 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Dec 16 02:07:31.428839 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 02:07:31.428851 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 02:07:31.428863 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 16 02:07:31.428890 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 16 02:07:31.428907 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 16 02:07:31.428918 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 16 02:07:31.428929 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 02:07:31.428941 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 02:07:31.428951 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Dec 16 02:07:31.428963 systemd[1]: Reached target slices.target - Slice Units. Dec 16 02:07:31.428973 systemd[1]: Reached target swap.target - Swaps. Dec 16 02:07:31.428985 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Dec 16 02:07:31.428996 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 16 02:07:31.429008 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 16 02:07:31.429019 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Dec 16 02:07:31.429056 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Dec 16 02:07:31.429070 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 02:07:31.434738 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Dec 16 02:07:31.434771 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Dec 16 02:07:31.434783 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 02:07:31.434794 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 02:07:31.434806 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 16 02:07:31.434817 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 16 02:07:31.434828 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 16 02:07:31.434839 systemd[1]: Mounting media.mount - External Media Directory... 
Dec 16 02:07:31.434852 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 16 02:07:31.434862 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 16 02:07:31.434874 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 16 02:07:31.434886 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 16 02:07:31.434897 systemd[1]: Reached target machines.target - Containers. Dec 16 02:07:31.434908 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 16 02:07:31.434919 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 02:07:31.434932 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 02:07:31.434943 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 16 02:07:31.434954 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 02:07:31.434966 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 02:07:31.434978 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 02:07:31.434988 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 16 02:07:31.435000 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 02:07:31.435013 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 16 02:07:31.435046 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 16 02:07:31.435075 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 16 02:07:31.435090 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 16 02:07:31.435446 systemd[1]: Stopped systemd-fsck-usr.service. Dec 16 02:07:31.435464 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 02:07:31.435476 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 02:07:31.435487 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 02:07:31.435503 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 02:07:31.435516 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 16 02:07:31.435529 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 16 02:07:31.435541 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 02:07:31.435552 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 16 02:07:31.435564 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 16 02:07:31.435574 systemd[1]: Mounted media.mount - External Media Directory. Dec 16 02:07:31.435585 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 16 02:07:31.435598 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 16 02:07:31.435610 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. 
Dec 16 02:07:31.435622 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 02:07:31.435635 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 16 02:07:31.435646 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 16 02:07:31.435658 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 02:07:31.435670 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 02:07:31.435680 kernel: fuse: init (API version 7.41) Dec 16 02:07:31.435692 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 02:07:31.435703 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 02:07:31.435714 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 02:07:31.435724 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 02:07:31.435736 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 02:07:31.435747 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 16 02:07:31.435758 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 02:07:31.435769 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 02:07:31.435781 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 02:07:31.435792 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 16 02:07:31.435803 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 16 02:07:31.435815 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 16 02:07:31.435826 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 16 02:07:31.435867 systemd-journald[1189]: Collecting audit messages is enabled. Dec 16 02:07:31.435894 kernel: ACPI: bus type drm_connector registered Dec 16 02:07:31.435906 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 16 02:07:31.435919 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 02:07:31.435932 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 02:07:31.435944 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 02:07:31.435956 systemd-journald[1189]: Journal started Dec 16 02:07:31.435978 systemd-journald[1189]: Runtime Journal (/run/log/journal/a108b4a0f6d04f76852245c450befaaf) is 8M, max 76.5M, 68.5M free. Dec 16 02:07:31.184000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Dec 16 02:07:31.440426 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 02:07:31.440460 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Dec 16 02:07:31.302000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.303000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:07:31.306000 audit: BPF prog-id=14 op=UNLOAD Dec 16 02:07:31.447967 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 16 02:07:31.448038 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 02:07:31.448062 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 16 02:07:31.306000 audit: BPF prog-id=13 op=UNLOAD Dec 16 02:07:31.307000 audit: BPF prog-id=15 op=LOAD Dec 16 02:07:31.310000 audit: BPF prog-id=16 op=LOAD Dec 16 02:07:31.311000 audit: BPF prog-id=17 op=LOAD Dec 16 02:07:31.366000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.370000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.370000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.373000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.373000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.376000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.376000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.379000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.379000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.450898 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 02:07:31.450948 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 02:07:31.383000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:07:31.418000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.418000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.421000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.424000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Dec 16 02:07:31.424000 audit[1189]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=6 a1=ffffe6fe3320 a2=4000 a3=0 items=0 ppid=1 pid=1189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:07:31.424000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Dec 16 02:07:31.430000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.432000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.436000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.436000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.103520 systemd[1]: Queued start job for default target multi-user.target. Dec 16 02:07:31.129178 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Dec 16 02:07:31.129717 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 16 02:07:31.454677 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 16 02:07:31.459014 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 02:07:31.469808 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 16 02:07:31.474316 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 16 02:07:31.478070 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 02:07:31.479000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:07:31.485000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.484704 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 02:07:31.496587 systemd-tmpfiles[1214]: ACLs are not supported, ignoring. Dec 16 02:07:31.496598 systemd-tmpfiles[1214]: ACLs are not supported, ignoring. Dec 16 02:07:31.501236 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 16 02:07:31.501000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.509581 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 02:07:31.510000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.513087 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 16 02:07:31.514000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.519906 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 16 02:07:31.525061 kernel: loop1: detected capacity change from 0 to 100192 Dec 16 02:07:31.526324 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 16 02:07:31.533102 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 16 02:07:31.536268 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 16 02:07:31.560213 systemd-journald[1189]: Time spent on flushing to /var/log/journal/a108b4a0f6d04f76852245c450befaaf is 36.926ms for 1303 entries. Dec 16 02:07:31.560213 systemd-journald[1189]: System Journal (/var/log/journal/a108b4a0f6d04f76852245c450befaaf) is 8M, max 588.1M, 580.1M free. Dec 16 02:07:31.610361 systemd-journald[1189]: Received client request to flush runtime journal. Dec 16 02:07:31.610502 kernel: loop2: detected capacity change from 0 to 45344 Dec 16 02:07:31.615055 kernel: loop3: detected capacity change from 0 to 8 Dec 16 02:07:31.615496 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 16 02:07:31.617000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.618503 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 02:07:31.620000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.621242 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. 
Dec 16 02:07:31.622000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.626423 kernel: loop4: detected capacity change from 0 to 200800 Dec 16 02:07:31.636264 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 16 02:07:31.637000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.638000 audit: BPF prog-id=18 op=LOAD Dec 16 02:07:31.639000 audit: BPF prog-id=19 op=LOAD Dec 16 02:07:31.640000 audit: BPF prog-id=20 op=LOAD Dec 16 02:07:31.644000 audit: BPF prog-id=21 op=LOAD Dec 16 02:07:31.641303 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Dec 16 02:07:31.645924 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 02:07:31.651410 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 02:07:31.661051 kernel: loop5: detected capacity change from 0 to 100192 Dec 16 02:07:31.668000 audit: BPF prog-id=22 op=LOAD Dec 16 02:07:31.668000 audit: BPF prog-id=23 op=LOAD Dec 16 02:07:31.668000 audit: BPF prog-id=24 op=LOAD Dec 16 02:07:31.673000 audit: BPF prog-id=25 op=LOAD Dec 16 02:07:31.673000 audit: BPF prog-id=26 op=LOAD Dec 16 02:07:31.673000 audit: BPF prog-id=27 op=LOAD Dec 16 02:07:31.671542 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 16 02:07:31.674959 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Dec 16 02:07:31.684099 kernel: loop6: detected capacity change from 0 to 45344 Dec 16 02:07:31.692805 systemd-tmpfiles[1267]: ACLs are not supported, ignoring. Dec 16 02:07:31.692816 systemd-tmpfiles[1267]: ACLs are not supported, ignoring. Dec 16 02:07:31.699217 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 02:07:31.700000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:31.709158 kernel: loop7: detected capacity change from 0 to 8 Dec 16 02:07:31.712068 kernel: loop1: detected capacity change from 0 to 200800 Dec 16 02:07:31.726905 (sd-merge)[1268]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-hetzner.raw'. Dec 16 02:07:31.734456 (sd-merge)[1268]: Merged extensions into '/usr'. Dec 16 02:07:31.743225 systemd[1]: Reload requested from client PID 1227 ('systemd-sysext') (unit systemd-sysext.service)... Dec 16 02:07:31.743250 systemd[1]: Reloading... Dec 16 02:07:31.751458 systemd-nsresourced[1271]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Dec 16 02:07:31.881102 zram_generator::config[1317]: No configuration found. Dec 16 02:07:31.954992 systemd-oomd[1265]: No swap; memory pressure usage will be degraded Dec 16 02:07:31.967096 systemd-resolved[1266]: Positive Trust Anchors: Dec 16 02:07:31.967121 systemd-resolved[1266]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 02:07:31.967125 systemd-resolved[1266]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Dec 16 02:07:31.967158 systemd-resolved[1266]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 02:07:31.982123 systemd-resolved[1266]: Using system hostname 'ci-4547-0-0-9-be0981937a'. Dec 16 02:07:32.095563 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 16 02:07:32.095876 systemd[1]: Reloading finished in 352 ms. Dec 16 02:07:32.112073 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Dec 16 02:07:32.115000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.116274 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 16 02:07:32.117000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.117639 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Dec 16 02:07:32.118000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.118622 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 02:07:32.119000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.120017 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 16 02:07:32.120000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.124358 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 02:07:32.134262 systemd[1]: Starting ensure-sysext.service... Dec 16 02:07:32.137294 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Dec 16 02:07:32.139000 audit: BPF prog-id=28 op=LOAD Dec 16 02:07:32.139000 audit: BPF prog-id=22 op=UNLOAD Dec 16 02:07:32.139000 audit: BPF prog-id=29 op=LOAD Dec 16 02:07:32.139000 audit: BPF prog-id=30 op=LOAD Dec 16 02:07:32.139000 audit: BPF prog-id=23 op=UNLOAD Dec 16 02:07:32.139000 audit: BPF prog-id=24 op=UNLOAD Dec 16 02:07:32.140000 audit: BPF prog-id=31 op=LOAD Dec 16 02:07:32.140000 audit: BPF prog-id=15 op=UNLOAD Dec 16 02:07:32.140000 audit: BPF prog-id=32 op=LOAD Dec 16 02:07:32.140000 audit: BPF prog-id=33 op=LOAD Dec 16 02:07:32.140000 audit: BPF prog-id=16 op=UNLOAD Dec 16 02:07:32.140000 audit: BPF prog-id=17 op=UNLOAD Dec 16 02:07:32.142000 audit: BPF prog-id=34 op=LOAD Dec 16 02:07:32.142000 audit: BPF prog-id=25 op=UNLOAD Dec 16 02:07:32.142000 audit: BPF prog-id=35 op=LOAD Dec 16 02:07:32.142000 audit: BPF prog-id=36 op=LOAD Dec 16 02:07:32.142000 audit: BPF prog-id=26 op=UNLOAD Dec 16 02:07:32.142000 audit: BPF prog-id=27 op=UNLOAD Dec 16 02:07:32.142000 audit: BPF prog-id=37 op=LOAD Dec 16 02:07:32.142000 audit: BPF prog-id=21 op=UNLOAD Dec 16 02:07:32.143000 audit: BPF prog-id=38 op=LOAD Dec 16 02:07:32.143000 audit: BPF prog-id=18 op=UNLOAD Dec 16 02:07:32.143000 audit: BPF prog-id=39 op=LOAD Dec 16 02:07:32.143000 audit: BPF prog-id=40 op=LOAD Dec 16 02:07:32.143000 audit: BPF prog-id=19 op=UNLOAD Dec 16 02:07:32.143000 audit: BPF prog-id=20 op=UNLOAD Dec 16 02:07:32.148725 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 16 02:07:32.154240 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 16 02:07:32.154000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.156000 audit: BPF prog-id=8 op=UNLOAD Dec 16 02:07:32.156000 audit: BPF prog-id=7 op=UNLOAD Dec 16 02:07:32.159000 audit: BPF prog-id=41 op=LOAD Dec 16 02:07:32.159000 audit: BPF prog-id=42 op=LOAD Dec 16 02:07:32.160621 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 02:07:32.167299 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 16 02:07:32.177486 systemd[1]: Reload requested from client PID 1350 ('systemctl') (unit ensure-sysext.service)... Dec 16 02:07:32.177512 systemd[1]: Reloading... Dec 16 02:07:32.180200 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 16 02:07:32.180223 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 16 02:07:32.182667 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 16 02:07:32.183778 systemd-tmpfiles[1351]: ACLs are not supported, ignoring. Dec 16 02:07:32.183837 systemd-tmpfiles[1351]: ACLs are not supported, ignoring. Dec 16 02:07:32.189963 systemd-tmpfiles[1351]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 02:07:32.189982 systemd-tmpfiles[1351]: Skipping /boot Dec 16 02:07:32.200186 systemd-tmpfiles[1351]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 02:07:32.200200 systemd-tmpfiles[1351]: Skipping /boot Dec 16 02:07:32.230197 systemd-udevd[1355]: Using default interface naming scheme 'v257'. Dec 16 02:07:32.319924 zram_generator::config[1403]: No configuration found. 
Dec 16 02:07:32.463062 kernel: mousedev: PS/2 mouse device common for all mice Dec 16 02:07:32.563996 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Dec 16 02:07:32.564925 systemd[1]: Reloading finished in 387 ms. Dec 16 02:07:32.578344 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 02:07:32.580000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.582000 audit: BPF prog-id=43 op=LOAD Dec 16 02:07:32.583000 audit: BPF prog-id=31 op=UNLOAD Dec 16 02:07:32.583000 audit: BPF prog-id=44 op=LOAD Dec 16 02:07:32.583000 audit: BPF prog-id=45 op=LOAD Dec 16 02:07:32.583000 audit: BPF prog-id=32 op=UNLOAD Dec 16 02:07:32.583000 audit: BPF prog-id=33 op=UNLOAD Dec 16 02:07:32.584000 audit: BPF prog-id=46 op=LOAD Dec 16 02:07:32.584000 audit: BPF prog-id=28 op=UNLOAD Dec 16 02:07:32.584000 audit: BPF prog-id=47 op=LOAD Dec 16 02:07:32.584000 audit: BPF prog-id=48 op=LOAD Dec 16 02:07:32.585000 audit: BPF prog-id=29 op=UNLOAD Dec 16 02:07:32.585000 audit: BPF prog-id=30 op=UNLOAD Dec 16 02:07:32.585000 audit: BPF prog-id=49 op=LOAD Dec 16 02:07:32.586000 audit: BPF prog-id=38 op=UNLOAD Dec 16 02:07:32.586000 audit: BPF prog-id=50 op=LOAD Dec 16 02:07:32.586000 audit: BPF prog-id=51 op=LOAD Dec 16 02:07:32.586000 audit: BPF prog-id=39 op=UNLOAD Dec 16 02:07:32.586000 audit: BPF prog-id=40 op=UNLOAD Dec 16 02:07:32.588000 audit: BPF prog-id=52 op=LOAD Dec 16 02:07:32.588000 audit: BPF prog-id=34 op=UNLOAD Dec 16 02:07:32.588000 audit: BPF prog-id=53 op=LOAD Dec 16 02:07:32.588000 audit: BPF prog-id=54 op=LOAD Dec 16 02:07:32.588000 audit: BPF prog-id=35 op=UNLOAD Dec 16 02:07:32.588000 audit: BPF prog-id=36 op=UNLOAD Dec 16 02:07:32.589000 audit: BPF prog-id=55 op=LOAD Dec 16 02:07:32.596000 audit: BPF prog-id=56 op=LOAD Dec 16 02:07:32.596000 audit: BPF prog-id=41 op=UNLOAD Dec 16 02:07:32.596000 audit: BPF prog-id=42 op=UNLOAD Dec 16 02:07:32.597000 audit: BPF prog-id=57 op=LOAD Dec 16 02:07:32.597000 audit: BPF prog-id=37 op=UNLOAD Dec 16 02:07:32.604000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.603715 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 02:07:32.629615 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Dec 16 02:07:32.632000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.632438 systemd[1]: Finished ensure-sysext.service. Dec 16 02:07:32.646049 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Dec 16 02:07:32.646107 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 16 02:07:32.646120 kernel: [drm] features: -context_init Dec 16 02:07:32.650306 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 02:07:32.653856 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... 
Dec 16 02:07:32.654737 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 02:07:32.659368 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 02:07:32.661303 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 02:07:32.670221 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 02:07:32.675056 kernel: [drm] number of scanouts: 1 Dec 16 02:07:32.675142 kernel: [drm] number of cap sets: 0 Dec 16 02:07:32.679296 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 02:07:32.680644 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 02:07:32.680756 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 02:07:32.683075 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Dec 16 02:07:32.683759 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 16 02:07:32.684445 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 02:07:32.689000 audit: BPF prog-id=58 op=LOAD Dec 16 02:07:32.686231 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 16 02:07:32.691474 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 02:07:32.694000 audit: BPF prog-id=59 op=LOAD Dec 16 02:07:32.697318 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Dec 16 02:07:32.705571 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 16 02:07:32.722952 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 02:07:32.725466 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 02:07:32.726000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.726000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.730000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.728546 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 02:07:32.730094 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 02:07:32.732000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.796189 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 02:07:32.814848 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 02:07:32.815163 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Dec 16 02:07:32.815000 audit[1483]: SYSTEM_BOOT pid=1483 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Dec 16 02:07:32.861000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Dec 16 02:07:32.861000 audit[1514]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffdfbaa800 a2=420 a3=0 items=0 ppid=1470 pid=1514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:07:32.861000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 02:07:32.862262 augenrules[1514]: No rules Dec 16 02:07:32.868284 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 16 02:07:32.883700 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 02:07:32.884911 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 02:07:32.888376 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 16 02:07:32.890205 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 02:07:32.890766 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 02:07:32.895206 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 16 02:07:32.930305 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 16 02:07:32.958082 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 16 02:07:32.960741 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 16 02:07:32.963154 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 02:07:32.963233 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 02:07:32.964413 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 02:07:32.966115 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 16 02:07:33.010521 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 16 02:07:33.060139 systemd-networkd[1480]: lo: Link UP Dec 16 02:07:33.060149 systemd-networkd[1480]: lo: Gained carrier Dec 16 02:07:33.064472 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 02:07:33.066651 systemd[1]: Reached target network.target - Network. Dec 16 02:07:33.067590 systemd-networkd[1480]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 02:07:33.067603 systemd-networkd[1480]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Dec 16 02:07:33.068323 systemd-networkd[1480]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 02:07:33.068334 systemd-networkd[1480]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 02:07:33.072022 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 16 02:07:33.072266 systemd-networkd[1480]: eth0: Link UP Dec 16 02:07:33.073420 systemd-networkd[1480]: eth0: Gained carrier Dec 16 02:07:33.074752 systemd-networkd[1480]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 02:07:33.075715 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 16 02:07:33.083439 systemd-networkd[1480]: eth1: Link UP Dec 16 02:07:33.085319 systemd-networkd[1480]: eth1: Gained carrier Dec 16 02:07:33.085664 systemd-networkd[1480]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 02:07:33.099496 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Dec 16 02:07:33.101854 systemd[1]: Reached target time-set.target - System Time Set. Dec 16 02:07:33.106172 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 02:07:33.118240 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 16 02:07:33.123181 systemd-networkd[1480]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Dec 16 02:07:33.126176 systemd-timesyncd[1481]: Network configuration changed, trying to establish connection. Dec 16 02:07:33.133157 systemd-networkd[1480]: eth0: DHCPv4 address 49.13.61.135/32, gateway 172.31.1.1 acquired from 172.31.1.1 Dec 16 02:07:33.174672 systemd-timesyncd[1481]: Contacted time server 131.188.3.221:123 (2.flatcar.pool.ntp.org). Dec 16 02:07:33.174773 systemd-timesyncd[1481]: Initial clock synchronization to Tue 2025-12-16 02:07:33.527002 UTC. Dec 16 02:07:33.232802 ldconfig[1478]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 16 02:07:33.238657 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 16 02:07:33.242009 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 16 02:07:33.266409 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 16 02:07:33.268177 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 02:07:33.269843 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 16 02:07:33.271359 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 16 02:07:33.273076 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 16 02:07:33.273763 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 16 02:07:33.274553 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Dec 16 02:07:33.275357 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Dec 16 02:07:33.275953 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. 
Dec 16 02:07:33.276759 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 16 02:07:33.276798 systemd[1]: Reached target paths.target - Path Units. Dec 16 02:07:33.277374 systemd[1]: Reached target timers.target - Timer Units. Dec 16 02:07:33.278714 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 16 02:07:33.282249 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 16 02:07:33.286262 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 16 02:07:33.287272 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 16 02:07:33.287988 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 16 02:07:33.291325 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 16 02:07:33.292367 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 16 02:07:33.293867 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 16 02:07:33.294722 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 02:07:33.295451 systemd[1]: Reached target basic.target - Basic System. Dec 16 02:07:33.296133 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 16 02:07:33.296164 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 16 02:07:33.297485 systemd[1]: Starting containerd.service - containerd container runtime... Dec 16 02:07:33.300201 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 16 02:07:33.304347 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 16 02:07:33.308298 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 16 02:07:33.314858 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 16 02:07:33.319351 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 16 02:07:33.320017 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 16 02:07:33.321813 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 16 02:07:33.332132 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 16 02:07:33.340628 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Dec 16 02:07:33.346959 jq[1549]: false Dec 16 02:07:33.352325 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 16 02:07:33.359369 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 16 02:07:33.365302 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 16 02:07:33.367155 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 16 02:07:33.367712 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 16 02:07:33.369224 systemd[1]: Starting update-engine.service - Update Engine... Dec 16 02:07:33.375880 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... 
Dec 16 02:07:33.383806 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 16 02:07:33.384975 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 16 02:07:33.386092 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 16 02:07:33.386457 systemd[1]: motdgen.service: Deactivated successfully. Dec 16 02:07:33.386675 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 16 02:07:33.405144 extend-filesystems[1550]: Found /dev/sda6 Dec 16 02:07:33.410633 extend-filesystems[1550]: Found /dev/sda9 Dec 16 02:07:33.411372 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 16 02:07:33.415612 coreos-metadata[1546]: Dec 16 02:07:33.415 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Dec 16 02:07:33.411693 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 16 02:07:33.418617 extend-filesystems[1550]: Checking size of /dev/sda9 Dec 16 02:07:33.422231 coreos-metadata[1546]: Dec 16 02:07:33.421 INFO Fetch successful Dec 16 02:07:33.430679 coreos-metadata[1546]: Dec 16 02:07:33.428 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Dec 16 02:07:33.430679 coreos-metadata[1546]: Dec 16 02:07:33.430 INFO Fetch successful Dec 16 02:07:33.433971 jq[1569]: true Dec 16 02:07:33.434232 tar[1571]: linux-arm64/LICENSE Dec 16 02:07:33.434232 tar[1571]: linux-arm64/helm Dec 16 02:07:33.455351 extend-filesystems[1550]: Resized partition /dev/sda9 Dec 16 02:07:33.465013 jq[1594]: true Dec 16 02:07:33.465296 extend-filesystems[1602]: resize2fs 1.47.3 (8-Jul-2025) Dec 16 02:07:33.484514 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 8410107 blocks Dec 16 02:07:33.516339 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 16 02:07:33.515999 dbus-daemon[1547]: [system] SELinux support is enabled Dec 16 02:07:33.520888 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 16 02:07:33.520922 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 16 02:07:33.523259 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 16 02:07:33.523293 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 16 02:07:33.524810 update_engine[1565]: I20251216 02:07:33.512841 1565 main.cc:92] Flatcar Update Engine starting Dec 16 02:07:33.539359 systemd[1]: Started update-engine.service - Update Engine. Dec 16 02:07:33.540231 update_engine[1565]: I20251216 02:07:33.539974 1565 update_check_scheduler.cc:74] Next update check in 3m37s Dec 16 02:07:33.548255 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 16 02:07:33.569785 systemd-logind[1564]: New seat seat0. Dec 16 02:07:33.579623 systemd-logind[1564]: Watching system buttons on /dev/input/event0 (Power Button) Dec 16 02:07:33.579655 systemd-logind[1564]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Dec 16 02:07:33.579967 systemd[1]: Started systemd-logind.service - User Login Management. 
Dec 16 02:07:33.672260 bash[1625]: Updated "/home/core/.ssh/authorized_keys" Dec 16 02:07:33.728451 kernel: EXT4-fs (sda9): resized filesystem to 8410107 Dec 16 02:07:33.682922 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 02:07:33.685476 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 16 02:07:33.689293 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 16 02:07:33.695345 systemd[1]: Starting sshkeys.service... Dec 16 02:07:33.730132 extend-filesystems[1602]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Dec 16 02:07:33.730132 extend-filesystems[1602]: old_desc_blocks = 1, new_desc_blocks = 5 Dec 16 02:07:33.730132 extend-filesystems[1602]: The filesystem on /dev/sda9 is now 8410107 (4k) blocks long. Dec 16 02:07:33.736005 extend-filesystems[1550]: Resized filesystem in /dev/sda9 Dec 16 02:07:33.736378 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 02:07:33.740243 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 16 02:07:33.757735 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 16 02:07:33.762469 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Dec 16 02:07:33.827981 coreos-metadata[1642]: Dec 16 02:07:33.827 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Dec 16 02:07:33.828865 coreos-metadata[1642]: Dec 16 02:07:33.828 INFO Fetch successful Dec 16 02:07:33.833184 unknown[1642]: wrote ssh authorized keys file for user: core Dec 16 02:07:33.878352 update-ssh-keys[1646]: Updated "/home/core/.ssh/authorized_keys" Dec 16 02:07:33.877983 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 16 02:07:33.881317 systemd[1]: Finished sshkeys.service. 
Dec 16 02:07:33.905810 locksmithd[1612]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 16 02:07:33.916409 containerd[1587]: time="2025-12-16T02:07:33Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 16 02:07:33.919978 containerd[1587]: time="2025-12-16T02:07:33.919934720Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Dec 16 02:07:33.942037 containerd[1587]: time="2025-12-16T02:07:33.939890200Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.52µs" Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942184200Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942241200Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942256880Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942457800Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942479360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942536560Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942548160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942824280Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942839640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942850320Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942858120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Dec 16 02:07:33.943785 containerd[1587]: time="2025-12-16T02:07:33.942992880Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Dec 16 02:07:33.944080 containerd[1587]: time="2025-12-16T02:07:33.943004840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 16 02:07:33.944080 containerd[1587]: 
time="2025-12-16T02:07:33.943099120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 16 02:07:33.944080 containerd[1587]: time="2025-12-16T02:07:33.943274960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 02:07:33.944080 containerd[1587]: time="2025-12-16T02:07:33.943301760Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 02:07:33.944080 containerd[1587]: time="2025-12-16T02:07:33.943311600Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 16 02:07:33.944080 containerd[1587]: time="2025-12-16T02:07:33.943348000Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 16 02:07:33.944080 containerd[1587]: time="2025-12-16T02:07:33.943668600Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 16 02:07:33.944080 containerd[1587]: time="2025-12-16T02:07:33.943733880Z" level=info msg="metadata content store policy set" policy=shared Dec 16 02:07:33.951692 containerd[1587]: time="2025-12-16T02:07:33.951639120Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 02:07:33.951965 containerd[1587]: time="2025-12-16T02:07:33.951941840Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 02:07:33.952292 containerd[1587]: time="2025-12-16T02:07:33.952268920Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 02:07:33.952317 containerd[1587]: time="2025-12-16T02:07:33.952292120Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 02:07:33.952317 containerd[1587]: time="2025-12-16T02:07:33.952307600Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 02:07:33.952374 containerd[1587]: time="2025-12-16T02:07:33.952319880Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 02:07:33.952374 containerd[1587]: time="2025-12-16T02:07:33.952332480Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 02:07:33.952374 containerd[1587]: time="2025-12-16T02:07:33.952343240Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 02:07:33.952374 containerd[1587]: time="2025-12-16T02:07:33.952357280Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 02:07:33.952374 containerd[1587]: time="2025-12-16T02:07:33.952370680Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 02:07:33.952374 containerd[1587]: time="2025-12-16T02:07:33.952394720Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 02:07:33.952511 containerd[1587]: time="2025-12-16T02:07:33.952408680Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service 
type=io.containerd.service.v1 Dec 16 02:07:33.952511 containerd[1587]: time="2025-12-16T02:07:33.952419880Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 02:07:33.952511 containerd[1587]: time="2025-12-16T02:07:33.952432720Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 02:07:33.952604 containerd[1587]: time="2025-12-16T02:07:33.952576520Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 02:07:33.952629 containerd[1587]: time="2025-12-16T02:07:33.952610560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 02:07:33.952647 containerd[1587]: time="2025-12-16T02:07:33.952627920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 02:07:33.952647 containerd[1587]: time="2025-12-16T02:07:33.952638880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 02:07:33.952688 containerd[1587]: time="2025-12-16T02:07:33.952656400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 02:07:33.952688 containerd[1587]: time="2025-12-16T02:07:33.952667520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 02:07:33.952720 containerd[1587]: time="2025-12-16T02:07:33.952698080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 02:07:33.952720 containerd[1587]: time="2025-12-16T02:07:33.952713120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 16 02:07:33.952756 containerd[1587]: time="2025-12-16T02:07:33.952728680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 02:07:33.952756 containerd[1587]: time="2025-12-16T02:07:33.952740840Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 02:07:33.952756 containerd[1587]: time="2025-12-16T02:07:33.952750960Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 02:07:33.952805 containerd[1587]: time="2025-12-16T02:07:33.952778080Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 02:07:33.952844 containerd[1587]: time="2025-12-16T02:07:33.952825400Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 02:07:33.952873 containerd[1587]: time="2025-12-16T02:07:33.952845720Z" level=info msg="Start snapshots syncer" Dec 16 02:07:33.955076 containerd[1587]: time="2025-12-16T02:07:33.954415160Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 02:07:33.955408 containerd[1587]: time="2025-12-16T02:07:33.955353880Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 02:07:33.955506 containerd[1587]: time="2025-12-16T02:07:33.955427240Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 02:07:33.956478 containerd[1587]: time="2025-12-16T02:07:33.956447360Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 02:07:33.957149 containerd[1587]: time="2025-12-16T02:07:33.957122120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 02:07:33.957197 containerd[1587]: time="2025-12-16T02:07:33.957180120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 02:07:33.957220 containerd[1587]: time="2025-12-16T02:07:33.957201600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 02:07:33.957220 containerd[1587]: time="2025-12-16T02:07:33.957214120Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 02:07:33.957258 containerd[1587]: time="2025-12-16T02:07:33.957227000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 02:07:33.957258 containerd[1587]: time="2025-12-16T02:07:33.957239080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 02:07:33.957258 containerd[1587]: time="2025-12-16T02:07:33.957252160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 02:07:33.957309 containerd[1587]: time="2025-12-16T02:07:33.957262640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 
02:07:33.957505 containerd[1587]: time="2025-12-16T02:07:33.957483200Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 02:07:33.958818 containerd[1587]: time="2025-12-16T02:07:33.958789400Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 02:07:33.959574 containerd[1587]: time="2025-12-16T02:07:33.959545440Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 02:07:33.959604 containerd[1587]: time="2025-12-16T02:07:33.959573920Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 02:07:33.959604 containerd[1587]: time="2025-12-16T02:07:33.959587360Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 02:07:33.959604 containerd[1587]: time="2025-12-16T02:07:33.959595960Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 02:07:33.959667 containerd[1587]: time="2025-12-16T02:07:33.959611760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 02:07:33.959667 containerd[1587]: time="2025-12-16T02:07:33.959625240Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 02:07:33.959720 containerd[1587]: time="2025-12-16T02:07:33.959705920Z" level=info msg="runtime interface created" Dec 16 02:07:33.959720 containerd[1587]: time="2025-12-16T02:07:33.959715840Z" level=info msg="created NRI interface" Dec 16 02:07:33.959764 containerd[1587]: time="2025-12-16T02:07:33.959726440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 02:07:33.959764 containerd[1587]: time="2025-12-16T02:07:33.959742800Z" level=info msg="Connect containerd service" Dec 16 02:07:33.959797 containerd[1587]: time="2025-12-16T02:07:33.959776640Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 16 02:07:33.963518 containerd[1587]: time="2025-12-16T02:07:33.963480600Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 02:07:33.994053 sshd_keygen[1597]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 02:07:34.052695 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 16 02:07:34.059203 systemd[1]: Starting issuegen.service - Generate /run/issue... 
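The containerd error above is expected at this stage: the CRI plugin starts with pod networking unconfigured because /etc/cni/net.d holds no CNI config yet, and a network add-on installed later normally drops one there. A minimal sketch of the same directory check (the accepted file extensions are an assumption for illustration, not containerd's exact logic):

    # Mirror the condition containerd reports above: no CNI config present yet.
    # The extension filter is an illustrative assumption.
    from pathlib import Path

    CNI_CONF_DIR = Path("/etc/cni/net.d")
    configs = sorted(p.name for p in CNI_CONF_DIR.glob("*")
                     if p.suffix in {".conf", ".conflist", ".json"})

    if not configs:
        print(f"no network config found in {CNI_CONF_DIR}: cni plugin not initialized")
    else:
        print("CNI configs:", configs)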
Dec 16 02:07:34.089375 containerd[1587]: time="2025-12-16T02:07:34.088207341Z" level=info msg="Start subscribing containerd event" Dec 16 02:07:34.089375 containerd[1587]: time="2025-12-16T02:07:34.088299134Z" level=info msg="Start recovering state" Dec 16 02:07:34.089375 containerd[1587]: time="2025-12-16T02:07:34.088401118Z" level=info msg="Start event monitor" Dec 16 02:07:34.089375 containerd[1587]: time="2025-12-16T02:07:34.088414858Z" level=info msg="Start cni network conf syncer for default" Dec 16 02:07:34.089375 containerd[1587]: time="2025-12-16T02:07:34.088422250Z" level=info msg="Start streaming server" Dec 16 02:07:34.089375 containerd[1587]: time="2025-12-16T02:07:34.088430978Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 02:07:34.089375 containerd[1587]: time="2025-12-16T02:07:34.088438119Z" level=info msg="runtime interface starting up..." Dec 16 02:07:34.089375 containerd[1587]: time="2025-12-16T02:07:34.088444258Z" level=info msg="starting plugins..." Dec 16 02:07:34.089375 containerd[1587]: time="2025-12-16T02:07:34.088462174Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 02:07:34.089375 containerd[1587]: time="2025-12-16T02:07:34.088984912Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 02:07:34.089375 containerd[1587]: time="2025-12-16T02:07:34.089100301Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 16 02:07:34.089440 systemd[1]: Started containerd.service - containerd container runtime. Dec 16 02:07:34.091162 containerd[1587]: time="2025-12-16T02:07:34.090616604Z" level=info msg="containerd successfully booted in 0.174822s" Dec 16 02:07:34.096062 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 02:07:34.097307 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 16 02:07:34.103336 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 02:07:34.126465 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 16 02:07:34.131459 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 02:07:34.135048 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 16 02:07:34.137450 systemd[1]: Reached target getty.target - Login Prompts. Dec 16 02:07:34.162678 tar[1571]: linux-arm64/README.md Dec 16 02:07:34.186208 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 16 02:07:34.272289 systemd-networkd[1480]: eth0: Gained IPv6LL Dec 16 02:07:34.276657 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 02:07:34.278422 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 02:07:34.282969 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 02:07:34.288470 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 02:07:34.326480 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 16 02:07:34.591225 systemd-networkd[1480]: eth1: Gained IPv6LL Dec 16 02:07:35.102957 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 02:07:35.107940 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 02:07:35.110772 systemd[1]: Startup finished in 1.796s (kernel) + 5.441s (initrd) + 4.577s (userspace) = 11.814s. 
Dec 16 02:07:35.111549 (kubelet)[1704]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 02:07:35.592420 kubelet[1704]: E1216 02:07:35.592372 1704 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 02:07:35.596234 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 02:07:35.596431 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 02:07:35.597362 systemd[1]: kubelet.service: Consumed 828ms CPU time, 246.3M memory peak. Dec 16 02:07:45.847869 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 02:07:45.850878 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 02:07:46.034845 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 02:07:46.045598 (kubelet)[1723]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 02:07:46.094225 kubelet[1723]: E1216 02:07:46.094167 1723 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 02:07:46.097461 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 02:07:46.097720 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 02:07:46.098533 systemd[1]: kubelet.service: Consumed 175ms CPU time, 107.3M memory peak. Dec 16 02:07:56.183339 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 02:07:56.186871 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 02:07:56.349444 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 02:07:56.364928 (kubelet)[1737]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 02:07:56.414971 kubelet[1737]: E1216 02:07:56.414920 1737 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 02:07:56.418814 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 02:07:56.418992 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 02:07:56.420190 systemd[1]: kubelet.service: Consumed 172ms CPU time, 106.7M memory peak. Dec 16 02:08:06.432842 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 16 02:08:06.435273 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 02:08:06.598334 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
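The kubelet failures above, and the further restarts below, all have the same cause: /var/lib/kubelet/config.yaml does not exist yet. On a kubeadm-managed node that file is normally written when the node is initialized or joined to a cluster, so systemd simply keeps rescheduling the unit, bumping the restart counter roughly every ten seconds, until that happens. A minimal sketch of the failing check:

    # Reproduce the check kubelet is failing on above; the config file is
    # normally created later by `kubeadm init` / `kubeadm join`.
    from pathlib import Path

    CONFIG = Path("/var/lib/kubelet/config.yaml")

    if not CONFIG.is_file():
        raise SystemExit(f"failed to load Kubelet config file {CONFIG}: "
                         "no such file or directory")
    print(CONFIG.read_text()[:200])  # first lines of the KubeletConfiguration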
Dec 16 02:08:06.612600 (kubelet)[1752]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 02:08:06.657055 kubelet[1752]: E1216 02:08:06.656934 1752 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 02:08:06.661427 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 02:08:06.661856 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 02:08:06.662886 systemd[1]: kubelet.service: Consumed 170ms CPU time, 106.7M memory peak. Dec 16 02:08:08.825746 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 16 02:08:08.827760 systemd[1]: Started sshd@0-49.13.61.135:22-139.178.68.195:48964.service - OpenSSH per-connection server daemon (139.178.68.195:48964). Dec 16 02:08:09.688064 sshd[1760]: Accepted publickey for core from 139.178.68.195 port 48964 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:08:09.692207 sshd-session[1760]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:08:09.705068 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 02:08:09.706434 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 02:08:09.713120 systemd-logind[1564]: New session 1 of user core. Dec 16 02:08:09.739017 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 02:08:09.743799 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 02:08:09.771466 (systemd)[1766]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:08:09.775014 systemd-logind[1564]: New session 2 of user core. Dec 16 02:08:09.911399 systemd[1766]: Queued start job for default target default.target. Dec 16 02:08:09.934936 systemd[1766]: Created slice app.slice - User Application Slice. Dec 16 02:08:09.935253 systemd[1766]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Dec 16 02:08:09.935280 systemd[1766]: Reached target paths.target - Paths. Dec 16 02:08:09.935371 systemd[1766]: Reached target timers.target - Timers. Dec 16 02:08:09.937443 systemd[1766]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 02:08:09.938739 systemd[1766]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Dec 16 02:08:09.963360 systemd[1766]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Dec 16 02:08:09.964260 systemd[1766]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 02:08:09.964383 systemd[1766]: Reached target sockets.target - Sockets. Dec 16 02:08:09.964438 systemd[1766]: Reached target basic.target - Basic System. Dec 16 02:08:09.964468 systemd[1766]: Reached target default.target - Main User Target. Dec 16 02:08:09.964495 systemd[1766]: Startup finished in 182ms. Dec 16 02:08:09.964861 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 02:08:09.969274 systemd[1]: Started session-1.scope - Session 1 of User core. 
Dec 16 02:08:10.457504 systemd[1]: Started sshd@1-49.13.61.135:22-139.178.68.195:40096.service - OpenSSH per-connection server daemon (139.178.68.195:40096). Dec 16 02:08:11.314287 sshd[1780]: Accepted publickey for core from 139.178.68.195 port 40096 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:08:11.316266 sshd-session[1780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:08:11.323795 systemd-logind[1564]: New session 3 of user core. Dec 16 02:08:11.339355 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 02:08:11.793324 sshd[1784]: Connection closed by 139.178.68.195 port 40096 Dec 16 02:08:11.794350 sshd-session[1780]: pam_unix(sshd:session): session closed for user core Dec 16 02:08:11.800543 systemd[1]: sshd@1-49.13.61.135:22-139.178.68.195:40096.service: Deactivated successfully. Dec 16 02:08:11.802555 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 02:08:11.804380 systemd-logind[1564]: Session 3 logged out. Waiting for processes to exit. Dec 16 02:08:11.805909 systemd-logind[1564]: Removed session 3. Dec 16 02:08:11.964149 systemd[1]: Started sshd@2-49.13.61.135:22-139.178.68.195:40102.service - OpenSSH per-connection server daemon (139.178.68.195:40102). Dec 16 02:08:12.807527 sshd[1790]: Accepted publickey for core from 139.178.68.195 port 40102 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:08:12.810078 sshd-session[1790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:08:12.817317 systemd-logind[1564]: New session 4 of user core. Dec 16 02:08:12.827408 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 16 02:08:13.278928 sshd[1794]: Connection closed by 139.178.68.195 port 40102 Dec 16 02:08:13.279820 sshd-session[1790]: pam_unix(sshd:session): session closed for user core Dec 16 02:08:13.285754 systemd-logind[1564]: Session 4 logged out. Waiting for processes to exit. Dec 16 02:08:13.286446 systemd[1]: sshd@2-49.13.61.135:22-139.178.68.195:40102.service: Deactivated successfully. Dec 16 02:08:13.290610 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 02:08:13.293399 systemd-logind[1564]: Removed session 4. Dec 16 02:08:13.453383 systemd[1]: Started sshd@3-49.13.61.135:22-139.178.68.195:40112.service - OpenSSH per-connection server daemon (139.178.68.195:40112). Dec 16 02:08:14.307893 sshd[1800]: Accepted publickey for core from 139.178.68.195 port 40112 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:08:14.309581 sshd-session[1800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:08:14.315082 systemd-logind[1564]: New session 5 of user core. Dec 16 02:08:14.322356 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 16 02:08:14.784605 sshd[1804]: Connection closed by 139.178.68.195 port 40112 Dec 16 02:08:14.785268 sshd-session[1800]: pam_unix(sshd:session): session closed for user core Dec 16 02:08:14.790881 systemd[1]: sshd@3-49.13.61.135:22-139.178.68.195:40112.service: Deactivated successfully. Dec 16 02:08:14.793348 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 02:08:14.794822 systemd-logind[1564]: Session 5 logged out. Waiting for processes to exit. Dec 16 02:08:14.796708 systemd-logind[1564]: Removed session 5. Dec 16 02:08:14.952248 systemd[1]: Started sshd@4-49.13.61.135:22-139.178.68.195:40126.service - OpenSSH per-connection server daemon (139.178.68.195:40126). 
Dec 16 02:08:15.798893 sshd[1810]: Accepted publickey for core from 139.178.68.195 port 40126 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:08:15.800744 sshd-session[1810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:08:15.806462 systemd-logind[1564]: New session 6 of user core. Dec 16 02:08:15.817660 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 16 02:08:16.128293 sudo[1815]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 02:08:16.128596 sudo[1815]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 02:08:16.140149 sudo[1815]: pam_unix(sudo:session): session closed for user root Dec 16 02:08:16.296217 sshd[1814]: Connection closed by 139.178.68.195 port 40126 Dec 16 02:08:16.297060 sshd-session[1810]: pam_unix(sshd:session): session closed for user core Dec 16 02:08:16.303766 systemd[1]: sshd@4-49.13.61.135:22-139.178.68.195:40126.service: Deactivated successfully. Dec 16 02:08:16.306683 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 02:08:16.308021 systemd-logind[1564]: Session 6 logged out. Waiting for processes to exit. Dec 16 02:08:16.309793 systemd-logind[1564]: Removed session 6. Dec 16 02:08:16.468263 systemd[1]: Started sshd@5-49.13.61.135:22-139.178.68.195:40130.service - OpenSSH per-connection server daemon (139.178.68.195:40130). Dec 16 02:08:16.682861 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Dec 16 02:08:16.685836 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 02:08:16.852474 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 02:08:16.860625 (kubelet)[1833]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 02:08:16.906145 kubelet[1833]: E1216 02:08:16.906095 1833 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 02:08:16.908779 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 02:08:16.908908 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 02:08:16.909532 systemd[1]: kubelet.service: Consumed 172ms CPU time, 106.9M memory peak. Dec 16 02:08:17.303860 sshd[1822]: Accepted publickey for core from 139.178.68.195 port 40130 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:08:17.305643 sshd-session[1822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:08:17.313414 systemd-logind[1564]: New session 7 of user core. Dec 16 02:08:17.320470 systemd[1]: Started session-7.scope - Session 7 of User core. 
Dec 16 02:08:17.623686 sudo[1842]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 02:08:17.624139 sudo[1842]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 02:08:17.627627 sudo[1842]: pam_unix(sudo:session): session closed for user root Dec 16 02:08:17.635902 sudo[1841]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 02:08:17.636193 sudo[1841]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 02:08:17.646104 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 02:08:17.692000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 02:08:17.694136 kernel: kauditd_printk_skb: 179 callbacks suppressed Dec 16 02:08:17.694214 kernel: audit: type=1305 audit(1765850897.692:224): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 02:08:17.694439 augenrules[1866]: No rules Dec 16 02:08:17.692000 audit[1866]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffdab3e1b0 a2=420 a3=0 items=0 ppid=1847 pid=1866 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:17.696992 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 02:08:17.697867 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 02:08:17.698418 kernel: audit: type=1300 audit(1765850897.692:224): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffdab3e1b0 a2=420 a3=0 items=0 ppid=1847 pid=1866 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:17.692000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 02:08:17.700042 kernel: audit: type=1327 audit(1765850897.692:224): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 02:08:17.700103 kernel: audit: type=1130 audit(1765850897.694:225): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:17.694000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:17.694000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:17.701731 sudo[1841]: pam_unix(sudo:session): session closed for user root Dec 16 02:08:17.703213 kernel: audit: type=1131 audit(1765850897.694:226): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:08:17.703286 kernel: audit: type=1106 audit(1765850897.700:227): pid=1841 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 02:08:17.700000 audit[1841]: USER_END pid=1841 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 02:08:17.700000 audit[1841]: CRED_DISP pid=1841 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 02:08:17.706379 kernel: audit: type=1104 audit(1765850897.700:228): pid=1841 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 02:08:17.857746 sshd[1840]: Connection closed by 139.178.68.195 port 40130 Dec 16 02:08:17.857000 sshd-session[1822]: pam_unix(sshd:session): session closed for user core Dec 16 02:08:17.858000 audit[1822]: USER_END pid=1822 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:08:17.858000 audit[1822]: CRED_DISP pid=1822 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:08:17.864815 kernel: audit: type=1106 audit(1765850897.858:229): pid=1822 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:08:17.864921 kernel: audit: type=1104 audit(1765850897.858:230): pid=1822 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:08:17.867523 systemd-logind[1564]: Session 7 logged out. Waiting for processes to exit. Dec 16 02:08:17.868184 systemd[1]: sshd@5-49.13.61.135:22-139.178.68.195:40130.service: Deactivated successfully. Dec 16 02:08:17.867000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-49.13.61.135:22-139.178.68.195:40130 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:17.870421 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 02:08:17.874102 kernel: audit: type=1131 audit(1765850897.867:231): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-49.13.61.135:22-139.178.68.195:40130 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:17.875429 systemd-logind[1564]: Removed session 7. 
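The audit PROCTITLE fields above and below are the audited process's command line, hex-encoded with NUL bytes separating the arguments. The auditctl record above decodes to /sbin/auditctl -R /etc/audit/audit.rules, and the Docker-related records further below decode the same way (for example to /usr/bin/iptables --wait -t nat -N DOCKER). A small decoding sketch:

    # Decode an audit PROCTITLE value: hex bytes, NUL-separated argv.
    def decode_proctitle(hex_value: str) -> list[str]:
        return [arg.decode() for arg in bytes.fromhex(hex_value).split(b"\x00")]

    raw = "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"
    print(decode_proctitle(raw))
    # ['/sbin/auditctl', '-R', '/etc/audit/audit.rules']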
Dec 16 02:08:18.030011 systemd[1]: Started sshd@6-49.13.61.135:22-139.178.68.195:40146.service - OpenSSH per-connection server daemon (139.178.68.195:40146). Dec 16 02:08:18.029000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-49.13.61.135:22-139.178.68.195:40146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:18.297148 update_engine[1565]: I20251216 02:08:18.296554 1565 update_attempter.cc:509] Updating boot flags... Dec 16 02:08:18.880000 audit[1875]: USER_ACCT pid=1875 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:08:18.881553 sshd[1875]: Accepted publickey for core from 139.178.68.195 port 40146 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:08:18.881000 audit[1875]: CRED_ACQ pid=1875 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:08:18.881000 audit[1875]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffac5a210 a2=3 a3=0 items=0 ppid=1 pid=1875 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:18.881000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:08:18.883379 sshd-session[1875]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:08:18.891473 systemd-logind[1564]: New session 8 of user core. Dec 16 02:08:18.897402 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 16 02:08:18.899000 audit[1875]: USER_START pid=1875 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:08:18.901000 audit[1899]: CRED_ACQ pid=1899 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:08:19.202000 audit[1900]: USER_ACCT pid=1900 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 02:08:19.204190 sudo[1900]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 02:08:19.203000 audit[1900]: CRED_REFR pid=1900 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 02:08:19.203000 audit[1900]: USER_START pid=1900 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 02:08:19.204845 sudo[1900]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 02:08:19.519335 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 16 02:08:19.541976 (dockerd)[1918]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 02:08:19.780914 dockerd[1918]: time="2025-12-16T02:08:19.780589500Z" level=info msg="Starting up" Dec 16 02:08:19.786961 dockerd[1918]: time="2025-12-16T02:08:19.786811933Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 02:08:19.801578 dockerd[1918]: time="2025-12-16T02:08:19.801484382Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 02:08:19.823204 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport784930589-merged.mount: Deactivated successfully. Dec 16 02:08:19.835597 systemd[1]: var-lib-docker-metacopy\x2dcheck2451857641-merged.mount: Deactivated successfully. Dec 16 02:08:19.847631 dockerd[1918]: time="2025-12-16T02:08:19.847331544Z" level=info msg="Loading containers: start." Dec 16 02:08:19.859145 kernel: Initializing XFRM netlink socket Dec 16 02:08:19.912000 audit[1965]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1965 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.912000 audit[1965]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffdd661430 a2=0 a3=0 items=0 ppid=1918 pid=1965 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.912000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 02:08:19.914000 audit[1967]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1967 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.914000 audit[1967]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffcf1bbf10 a2=0 a3=0 items=0 ppid=1918 pid=1967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.914000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 02:08:19.916000 audit[1969]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1969 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.916000 audit[1969]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdc2455c0 a2=0 a3=0 items=0 ppid=1918 pid=1969 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.916000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 02:08:19.919000 audit[1971]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1971 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.919000 audit[1971]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc2a45eb0 a2=0 a3=0 
items=0 ppid=1918 pid=1971 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.919000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 02:08:19.921000 audit[1973]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=1973 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.921000 audit[1973]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff0c26d10 a2=0 a3=0 items=0 ppid=1918 pid=1973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.921000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 02:08:19.923000 audit[1975]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=1975 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.923000 audit[1975]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffe90c1340 a2=0 a3=0 items=0 ppid=1918 pid=1975 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.923000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 02:08:19.926000 audit[1977]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1977 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.926000 audit[1977]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc038d430 a2=0 a3=0 items=0 ppid=1918 pid=1977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.926000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 02:08:19.928000 audit[1979]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=1979 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.928000 audit[1979]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffcda3d620 a2=0 a3=0 items=0 ppid=1918 pid=1979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.928000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 02:08:19.961000 audit[1982]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=1982 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.961000 audit[1982]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffc26d1590 a2=0 a3=0 items=0 ppid=1918 pid=1982 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.961000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Dec 16 02:08:19.963000 audit[1984]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=1984 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.963000 audit[1984]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffd0c85490 a2=0 a3=0 items=0 ppid=1918 pid=1984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.963000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 02:08:19.966000 audit[1986]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1986 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.966000 audit[1986]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffc925f3f0 a2=0 a3=0 items=0 ppid=1918 pid=1986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.966000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 02:08:19.968000 audit[1988]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=1988 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.968000 audit[1988]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffd1287670 a2=0 a3=0 items=0 ppid=1918 pid=1988 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.968000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 02:08:19.971000 audit[1990]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=1990 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:19.971000 audit[1990]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffe7a7f170 a2=0 a3=0 items=0 ppid=1918 pid=1990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:19.971000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 02:08:20.014000 audit[2020]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2020 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.014000 audit[2020]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffcb545e40 a2=0 a3=0 items=0 ppid=1918 pid=2020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 02:08:20.014000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 02:08:20.016000 audit[2022]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2022 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.016000 audit[2022]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffe1f5e130 a2=0 a3=0 items=0 ppid=1918 pid=2022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.016000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 02:08:20.018000 audit[2024]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2024 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.018000 audit[2024]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdbb69690 a2=0 a3=0 items=0 ppid=1918 pid=2024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.018000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 02:08:20.020000 audit[2026]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2026 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.020000 audit[2026]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc3f29240 a2=0 a3=0 items=0 ppid=1918 pid=2026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.020000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 02:08:20.022000 audit[2028]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2028 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.022000 audit[2028]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd2d4f7c0 a2=0 a3=0 items=0 ppid=1918 pid=2028 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.022000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 02:08:20.024000 audit[2030]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2030 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.024000 audit[2030]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffdc266fb0 a2=0 a3=0 items=0 ppid=1918 pid=2030 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.024000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 02:08:20.026000 audit[2032]: NETFILTER_CFG table=filter:21 family=10 entries=1 
op=nft_register_chain pid=2032 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.026000 audit[2032]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffff9a25e50 a2=0 a3=0 items=0 ppid=1918 pid=2032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.026000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 02:08:20.029000 audit[2034]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2034 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.029000 audit[2034]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffdfa04080 a2=0 a3=0 items=0 ppid=1918 pid=2034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.029000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 02:08:20.032000 audit[2036]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2036 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.032000 audit[2036]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffe78c0a70 a2=0 a3=0 items=0 ppid=1918 pid=2036 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.032000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Dec 16 02:08:20.036000 audit[2038]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2038 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.036000 audit[2038]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffff8482820 a2=0 a3=0 items=0 ppid=1918 pid=2038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.036000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 02:08:20.038000 audit[2040]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2040 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.038000 audit[2040]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffc0c48bf0 a2=0 a3=0 items=0 ppid=1918 pid=2040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.038000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 02:08:20.040000 audit[2042]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule 
pid=2042 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.040000 audit[2042]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffc6b27a60 a2=0 a3=0 items=0 ppid=1918 pid=2042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.040000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 02:08:20.042000 audit[2044]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2044 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.042000 audit[2044]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=fffff67aa270 a2=0 a3=0 items=0 ppid=1918 pid=2044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.042000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 02:08:20.049000 audit[2049]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2049 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:20.049000 audit[2049]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd350f5f0 a2=0 a3=0 items=0 ppid=1918 pid=2049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.049000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 02:08:20.052000 audit[2051]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2051 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:20.052000 audit[2051]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffc59377b0 a2=0 a3=0 items=0 ppid=1918 pid=2051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.052000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 02:08:20.055000 audit[2053]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:20.055000 audit[2053]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffc6682fc0 a2=0 a3=0 items=0 ppid=1918 pid=2053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.055000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 02:08:20.057000 audit[2055]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2055 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.057000 audit[2055]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc6495880 a2=0 a3=0 items=0 ppid=1918 
pid=2055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.057000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 02:08:20.060000 audit[2057]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2057 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.060000 audit[2057]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffdfd32fe0 a2=0 a3=0 items=0 ppid=1918 pid=2057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.060000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 02:08:20.062000 audit[2059]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2059 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:20.062000 audit[2059]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffd2bb6560 a2=0 a3=0 items=0 ppid=1918 pid=2059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.062000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 02:08:20.089000 audit[2063]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2063 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:20.089000 audit[2063]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffd7f6f930 a2=0 a3=0 items=0 ppid=1918 pid=2063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.089000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Dec 16 02:08:20.093000 audit[2065]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2065 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:20.093000 audit[2065]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffe812de10 a2=0 a3=0 items=0 ppid=1918 pid=2065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.093000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Dec 16 02:08:20.103000 audit[2073]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2073 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:20.103000 audit[2073]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=fffff3de4de0 a2=0 a3=0 items=0 ppid=1918 pid=2073 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.103000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Dec 16 02:08:20.117000 audit[2079]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2079 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:20.117000 audit[2079]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffda1de360 a2=0 a3=0 items=0 ppid=1918 pid=2079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.117000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Dec 16 02:08:20.120000 audit[2081]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2081 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:20.120000 audit[2081]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffd0419730 a2=0 a3=0 items=0 ppid=1918 pid=2081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.120000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Dec 16 02:08:20.124000 audit[2083]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2083 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:20.124000 audit[2083]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=fffffa149640 a2=0 a3=0 items=0 ppid=1918 pid=2083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.124000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Dec 16 02:08:20.127000 audit[2085]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2085 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:20.127000 audit[2085]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=ffffd9ae0550 a2=0 a3=0 items=0 ppid=1918 pid=2085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.127000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 02:08:20.129000 audit[2087]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2087 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:20.129000 audit[2087]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffcbd79420 a2=0 a3=0 items=0 
ppid=1918 pid=2087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:20.129000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Dec 16 02:08:20.131138 systemd-networkd[1480]: docker0: Link UP Dec 16 02:08:20.138537 dockerd[1918]: time="2025-12-16T02:08:20.138356123Z" level=info msg="Loading containers: done." Dec 16 02:08:20.162285 dockerd[1918]: time="2025-12-16T02:08:20.161473591Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 02:08:20.162285 dockerd[1918]: time="2025-12-16T02:08:20.161567549Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 02:08:20.162285 dockerd[1918]: time="2025-12-16T02:08:20.161723732Z" level=info msg="Initializing buildkit" Dec 16 02:08:20.191329 dockerd[1918]: time="2025-12-16T02:08:20.191288717Z" level=info msg="Completed buildkit initialization" Dec 16 02:08:20.201392 dockerd[1918]: time="2025-12-16T02:08:20.201326679Z" level=info msg="Daemon has completed initialization" Dec 16 02:08:20.201000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:20.203204 dockerd[1918]: time="2025-12-16T02:08:20.201746848Z" level=info msg="API listen on /run/docker.sock" Dec 16 02:08:20.202471 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 16 02:08:20.821056 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3612439815-merged.mount: Deactivated successfully. Dec 16 02:08:20.932166 containerd[1587]: time="2025-12-16T02:08:20.932108583Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Dec 16 02:08:21.673153 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount383988859.mount: Deactivated successfully. 
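The PROCTITLE field in the audit records above is the full command line of each iptables/ip6tables invocation, hex-encoded because the argv separators are NUL bytes. A minimal Python sketch for reading them back; the sample value is copied from one of the ip6tables records above:

```python
# Decode an audit PROCTITLE value: the process argv, hex-encoded,
# with NUL bytes separating the individual arguments.
def decode_proctitle(hex_value: str) -> str:
    raw = bytes.fromhex(hex_value)
    # Drop empty fields produced by consecutive/trailing NULs.
    args = [a.decode("utf-8", "replace") for a in raw.split(b"\x00") if a]
    return " ".join(args)

# Sample taken from one of the ip6tables audit records above.
sample = (
    "2F7573722F62696E2F6970367461626C6573"  # /usr/bin/ip6tables
    "002D2D77616974002D74006E6174002D4E00444F434B4552"
)
print(decode_proctitle(sample))
# -> /usr/bin/ip6tables --wait -t nat -N DOCKER
```

Decoded this way, the records above correspond to dockerd (ppid 1918) creating its DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-USER and DOCKER-ISOLATION-STAGE-1/2 chains for both IPv4 and IPv6.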
Dec 16 02:08:22.414452 containerd[1587]: time="2025-12-16T02:08:22.414403509Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:22.416247 containerd[1587]: time="2025-12-16T02:08:22.416189362Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=22974850" Dec 16 02:08:22.417860 containerd[1587]: time="2025-12-16T02:08:22.417793628Z" level=info msg="ImageCreate event name:\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:22.422611 containerd[1587]: time="2025-12-16T02:08:22.422531359Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:22.423793 containerd[1587]: time="2025-12-16T02:08:22.423611234Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"24567639\" in 1.491443908s" Dec 16 02:08:22.423793 containerd[1587]: time="2025-12-16T02:08:22.423655970Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\"" Dec 16 02:08:22.424711 containerd[1587]: time="2025-12-16T02:08:22.424431334Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Dec 16 02:08:23.538070 containerd[1587]: time="2025-12-16T02:08:23.536643292Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:23.538546 containerd[1587]: time="2025-12-16T02:08:23.538075311Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=0" Dec 16 02:08:23.539270 containerd[1587]: time="2025-12-16T02:08:23.539229273Z" level=info msg="ImageCreate event name:\"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:23.542731 containerd[1587]: time="2025-12-16T02:08:23.542688519Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:23.544349 containerd[1587]: time="2025-12-16T02:08:23.544301041Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"20719958\" in 1.119831973s" Dec 16 02:08:23.544448 containerd[1587]: time="2025-12-16T02:08:23.544433487Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\"" Dec 16 02:08:23.545208 
containerd[1587]: time="2025-12-16T02:08:23.545176986Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Dec 16 02:08:24.396911 containerd[1587]: time="2025-12-16T02:08:24.396709895Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:24.398212 containerd[1587]: time="2025-12-16T02:08:24.398152855Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=0" Dec 16 02:08:24.399659 containerd[1587]: time="2025-12-16T02:08:24.399603537Z" level=info msg="ImageCreate event name:\"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:24.407176 containerd[1587]: time="2025-12-16T02:08:24.407104633Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:24.408407 containerd[1587]: time="2025-12-16T02:08:24.408340564Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"15776215\" in 863.003963ms" Dec 16 02:08:24.408407 containerd[1587]: time="2025-12-16T02:08:24.408395182Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\"" Dec 16 02:08:24.410482 containerd[1587]: time="2025-12-16T02:08:24.410435061Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Dec 16 02:08:25.320199 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2748057613.mount: Deactivated successfully. 
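Each completed pull above logs both the image size and the wall-clock pull time (for instance kube-apiserver: size "24567639" in 1.491443908s). A back-of-the-envelope sketch using the figures quoted in the log; the log also reports a smaller "bytes read" value per image, so the result is only a rough effective rate, not measured network throughput:

```python
# Rough effective pull rate from the figures in the containerd entries above.
pulls = {
    # image: (size in bytes as logged, pull duration in seconds as logged)
    "kube-apiserver:v1.34.3":          (24_567_639, 1.491443908),
    "kube-controller-manager:v1.34.3": (20_719_958, 1.119831973),
    "kube-scheduler:v1.34.3":          (15_776_215, 0.863003963),
}

for image, (size, seconds) in pulls.items():
    print(f"{image}: {size / seconds / 2**20:.1f} MiB/s")
```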
Dec 16 02:08:25.558994 containerd[1587]: time="2025-12-16T02:08:25.558133206Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:25.560990 containerd[1587]: time="2025-12-16T02:08:25.560906607Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=0" Dec 16 02:08:25.562364 containerd[1587]: time="2025-12-16T02:08:25.562293808Z" level=info msg="ImageCreate event name:\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:25.564565 containerd[1587]: time="2025-12-16T02:08:25.564508472Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:25.565408 containerd[1587]: time="2025-12-16T02:08:25.565378669Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"22804272\" in 1.154900314s" Dec 16 02:08:25.565511 containerd[1587]: time="2025-12-16T02:08:25.565494985Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\"" Dec 16 02:08:25.566118 containerd[1587]: time="2025-12-16T02:08:25.566087694Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Dec 16 02:08:26.130186 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount901610549.mount: Deactivated successfully. 
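Because the containerd entries follow a fixed logfmt layout (time=..., level=..., msg="..."), the pull results scattered through the journal above can be tabulated in one regex pass. A sketch written against the escaped form shown above (the inner quotes appear as \" in the journal text); feeding it from `journalctl -u containerd --no-pager` is one option, assuming that unit name on this host:

```python
import re
import sys

# Matches the "Pulled image ... size ... in <duration>" messages above.
# The journal text escapes the inner quotes as \" , so the pattern matches
# a literal backslash followed by a quote around each field.
PULLED = re.compile(
    r'Pulled image \\"(?P<image>[^\\"]+)\\"'
    r'.*?size \\"(?P<size>\d+)\\" in (?P<duration>[\d.]+(?:ms|s))'
)

def summarize(log_text: str) -> None:
    for m in PULLED.finditer(log_text):
        print(f'{m["image"]:55} {int(m["size"]):>10} bytes  {m["duration"]}')

if __name__ == "__main__":
    # e.g.  journalctl -u containerd --no-pager | python3 pull_summary.py
    summarize(sys.stdin.read())
```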
Dec 16 02:08:26.896699 containerd[1587]: time="2025-12-16T02:08:26.896634432Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:26.898368 containerd[1587]: time="2025-12-16T02:08:26.898058585Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=19698598" Dec 16 02:08:26.899451 containerd[1587]: time="2025-12-16T02:08:26.899398672Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:26.902567 containerd[1587]: time="2025-12-16T02:08:26.902519701Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:26.904135 containerd[1587]: time="2025-12-16T02:08:26.904084016Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.33776997s" Dec 16 02:08:26.904135 containerd[1587]: time="2025-12-16T02:08:26.904134791Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Dec 16 02:08:26.905117 containerd[1587]: time="2025-12-16T02:08:26.904897263Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Dec 16 02:08:26.932387 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Dec 16 02:08:26.935298 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 02:08:27.098690 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 02:08:27.097000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:27.099640 kernel: kauditd_printk_skb: 132 callbacks suppressed Dec 16 02:08:27.099754 kernel: audit: type=1130 audit(1765850907.097:282): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:27.120805 (kubelet)[2261]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 02:08:27.166939 kubelet[2261]: E1216 02:08:27.166818 2261 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 02:08:27.170951 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 02:08:27.171206 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 02:08:27.172125 systemd[1]: kubelet.service: Consumed 173ms CPU time, 106.8M memory peak. 
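The kubelet failure just above (config file /var/lib/kubelet/config.yaml missing, unit exiting with status=1/FAILURE, restart counter already at 5) is the state a kubeadm-provisioned node is typically in before `kubeadm init` or `kubeadm join` has written that file; the unit keeps crash-looping until it appears. A small status-check sketch, with the path and unit name taken from the log and `systemctl is-active` used as one way to poll the unit:

```python
import pathlib
import subprocess

# Path reported in the kubelet error above; normally written by kubeadm init/join.
KUBELET_CONFIG = pathlib.Path("/var/lib/kubelet/config.yaml")

def kubelet_config_present() -> bool:
    return KUBELET_CONFIG.is_file()

def kubelet_unit_state() -> str:
    # `systemctl is-active` prints e.g. "active", "activating" or "failed"
    # and exits non-zero when the unit is not active, hence check=False.
    out = subprocess.run(
        ["systemctl", "is-active", "kubelet.service"],
        capture_output=True, text=True, check=False,
    )
    return out.stdout.strip() or "unknown"

if __name__ == "__main__":
    print("kubelet config present:", kubelet_config_present())
    print("kubelet unit state:    ", kubelet_unit_state())
```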
Dec 16 02:08:27.171000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 02:08:27.177091 kernel: audit: type=1131 audit(1765850907.171:283): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 02:08:27.425329 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount87364913.mount: Deactivated successfully. Dec 16 02:08:27.433114 containerd[1587]: time="2025-12-16T02:08:27.433013118Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:27.435488 containerd[1587]: time="2025-12-16T02:08:27.435368723Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=0" Dec 16 02:08:27.437884 containerd[1587]: time="2025-12-16T02:08:27.437844003Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:27.439402 containerd[1587]: time="2025-12-16T02:08:27.439364605Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:27.440847 containerd[1587]: time="2025-12-16T02:08:27.440811466Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 535.880393ms" Dec 16 02:08:27.440913 containerd[1587]: time="2025-12-16T02:08:27.440853519Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Dec 16 02:08:27.441414 containerd[1587]: time="2025-12-16T02:08:27.441389674Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Dec 16 02:08:28.016319 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4267641676.mount: Deactivated successfully. 
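Unit state changes also land in the audit stream as SERVICE_START/SERVICE_STOP records with a res=success|failed outcome; the kubelet stop above is the res=failed case. A sketch that pulls the unit name and result out of such lines to highlight repeatedly failing units, with the regex written against the field layout visible in this log:

```python
import re
import sys
from collections import Counter

# Matches records like:
#   audit[1]: SERVICE_STOP ... msg='unit=kubelet comm="systemd" ... res=failed'
SERVICE = re.compile(
    r"audit\[\d+\]: (?P<op>SERVICE_START|SERVICE_STOP)"
    r".*?unit=(?P<unit>\S+).*?res=(?P<res>success|failed)"
)

def failing_units(log_text: str) -> Counter:
    failures = Counter()
    for m in SERVICE.finditer(log_text):
        if m["res"] == "failed":
            failures[f'{m["unit"]} ({m["op"]})'] += 1
    return failures

if __name__ == "__main__":
    for unit, count in failing_units(sys.stdin.read()).most_common():
        print(f"{count:3}  {unit}")
```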
Dec 16 02:08:30.111752 containerd[1587]: time="2025-12-16T02:08:30.111650699Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:30.114073 containerd[1587]: time="2025-12-16T02:08:30.113851703Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=85821047" Dec 16 02:08:30.115306 containerd[1587]: time="2025-12-16T02:08:30.115253702Z" level=info msg="ImageCreate event name:\"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:30.123405 containerd[1587]: time="2025-12-16T02:08:30.123323051Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:30.128094 containerd[1587]: time="2025-12-16T02:08:30.127136789Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"98207481\" in 2.685712105s" Dec 16 02:08:30.128094 containerd[1587]: time="2025-12-16T02:08:30.127199525Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\"" Dec 16 02:08:36.087630 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 02:08:36.087828 systemd[1]: kubelet.service: Consumed 173ms CPU time, 106.8M memory peak. Dec 16 02:08:36.086000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:36.086000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:36.092235 kernel: audit: type=1130 audit(1765850916.086:284): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:36.092330 kernel: audit: type=1131 audit(1765850916.086:285): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:36.092624 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 02:08:36.132323 systemd[1]: Reload requested from client PID 2353 ('systemctl') (unit session-8.scope)... Dec 16 02:08:36.132340 systemd[1]: Reloading... Dec 16 02:08:36.272136 zram_generator::config[2403]: No configuration found. Dec 16 02:08:36.469389 systemd[1]: Reloading finished in 336 ms. 
Dec 16 02:08:36.507469 kernel: audit: type=1334 audit(1765850916.504:286): prog-id=63 op=LOAD Dec 16 02:08:36.507616 kernel: audit: type=1334 audit(1765850916.504:287): prog-id=49 op=UNLOAD Dec 16 02:08:36.507647 kernel: audit: type=1334 audit(1765850916.504:288): prog-id=64 op=LOAD Dec 16 02:08:36.504000 audit: BPF prog-id=63 op=LOAD Dec 16 02:08:36.504000 audit: BPF prog-id=49 op=UNLOAD Dec 16 02:08:36.504000 audit: BPF prog-id=64 op=LOAD Dec 16 02:08:36.510517 kernel: audit: type=1334 audit(1765850916.504:289): prog-id=65 op=LOAD Dec 16 02:08:36.510642 kernel: audit: type=1334 audit(1765850916.504:290): prog-id=50 op=UNLOAD Dec 16 02:08:36.510679 kernel: audit: type=1334 audit(1765850916.504:291): prog-id=51 op=UNLOAD Dec 16 02:08:36.504000 audit: BPF prog-id=65 op=LOAD Dec 16 02:08:36.504000 audit: BPF prog-id=50 op=UNLOAD Dec 16 02:08:36.504000 audit: BPF prog-id=51 op=UNLOAD Dec 16 02:08:36.505000 audit: BPF prog-id=66 op=LOAD Dec 16 02:08:36.512092 kernel: audit: type=1334 audit(1765850916.505:292): prog-id=66 op=LOAD Dec 16 02:08:36.512122 kernel: audit: type=1334 audit(1765850916.505:293): prog-id=43 op=UNLOAD Dec 16 02:08:36.505000 audit: BPF prog-id=43 op=UNLOAD Dec 16 02:08:36.508000 audit: BPF prog-id=67 op=LOAD Dec 16 02:08:36.508000 audit: BPF prog-id=68 op=LOAD Dec 16 02:08:36.508000 audit: BPF prog-id=44 op=UNLOAD Dec 16 02:08:36.508000 audit: BPF prog-id=45 op=UNLOAD Dec 16 02:08:36.510000 audit: BPF prog-id=69 op=LOAD Dec 16 02:08:36.510000 audit: BPF prog-id=52 op=UNLOAD Dec 16 02:08:36.510000 audit: BPF prog-id=70 op=LOAD Dec 16 02:08:36.511000 audit: BPF prog-id=71 op=LOAD Dec 16 02:08:36.511000 audit: BPF prog-id=53 op=UNLOAD Dec 16 02:08:36.511000 audit: BPF prog-id=54 op=UNLOAD Dec 16 02:08:36.511000 audit: BPF prog-id=72 op=LOAD Dec 16 02:08:36.511000 audit: BPF prog-id=46 op=UNLOAD Dec 16 02:08:36.511000 audit: BPF prog-id=73 op=LOAD Dec 16 02:08:36.511000 audit: BPF prog-id=74 op=LOAD Dec 16 02:08:36.511000 audit: BPF prog-id=47 op=UNLOAD Dec 16 02:08:36.511000 audit: BPF prog-id=48 op=UNLOAD Dec 16 02:08:36.512000 audit: BPF prog-id=75 op=LOAD Dec 16 02:08:36.512000 audit: BPF prog-id=59 op=UNLOAD Dec 16 02:08:36.513000 audit: BPF prog-id=76 op=LOAD Dec 16 02:08:36.516000 audit: BPF prog-id=58 op=UNLOAD Dec 16 02:08:36.516000 audit: BPF prog-id=77 op=LOAD Dec 16 02:08:36.516000 audit: BPF prog-id=78 op=LOAD Dec 16 02:08:36.516000 audit: BPF prog-id=55 op=UNLOAD Dec 16 02:08:36.516000 audit: BPF prog-id=56 op=UNLOAD Dec 16 02:08:36.517000 audit: BPF prog-id=79 op=LOAD Dec 16 02:08:36.517000 audit: BPF prog-id=57 op=UNLOAD Dec 16 02:08:36.520000 audit: BPF prog-id=80 op=LOAD Dec 16 02:08:36.520000 audit: BPF prog-id=60 op=UNLOAD Dec 16 02:08:36.520000 audit: BPF prog-id=81 op=LOAD Dec 16 02:08:36.520000 audit: BPF prog-id=82 op=LOAD Dec 16 02:08:36.520000 audit: BPF prog-id=61 op=UNLOAD Dec 16 02:08:36.520000 audit: BPF prog-id=62 op=UNLOAD Dec 16 02:08:36.535551 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 16 02:08:36.535654 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 16 02:08:36.535000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 02:08:36.536001 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 02:08:36.536075 systemd[1]: kubelet.service: Consumed 113ms CPU time, 95.1M memory peak. 
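The burst of `audit: BPF prog-id=... op=LOAD/UNLOAD` records above accompanies the systemd reload: on daemon-reload systemd typically detaches and re-attaches the cgroup BPF programs it uses for per-unit device filtering and IP accounting, which is why UNLOADs of old prog-ids are interleaved with LOADs of new ones. A sketch that tallies those operations; the kernel's `type=1334` echoes of the same events lack the literal `BPF` keyword, so they are not double-counted:

```python
import re
import sys
from collections import Counter

# Matches lines such as:  audit: BPF prog-id=63 op=LOAD
BPF_OP = re.compile(r"audit: BPF prog-id=(?P<prog>\d+) op=(?P<op>LOAD|UNLOAD)")

def bpf_op_counts(log_text: str) -> Counter:
    return Counter(m["op"] for m in BPF_OP.finditer(log_text))

if __name__ == "__main__":
    counts = bpf_op_counts(sys.stdin.read())
    print(f"LOAD={counts['LOAD']}  UNLOAD={counts['UNLOAD']}")
```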
Dec 16 02:08:36.539328 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 02:08:36.693188 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 02:08:36.692000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:36.704396 (kubelet)[2448]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 02:08:36.752679 kubelet[2448]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 02:08:36.752679 kubelet[2448]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 02:08:36.752679 kubelet[2448]: I1216 02:08:36.751919 2448 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 02:08:37.612050 kubelet[2448]: I1216 02:08:37.611734 2448 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 02:08:37.612050 kubelet[2448]: I1216 02:08:37.611784 2448 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 02:08:37.614380 kubelet[2448]: I1216 02:08:37.614352 2448 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 02:08:37.614556 kubelet[2448]: I1216 02:08:37.614527 2448 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 02:08:37.615400 kubelet[2448]: I1216 02:08:37.615060 2448 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 02:08:37.623243 kubelet[2448]: I1216 02:08:37.623204 2448 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 02:08:37.623587 kubelet[2448]: E1216 02:08:37.623561 2448 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://49.13.61.135:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 49.13.61.135:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 16 02:08:37.628544 kubelet[2448]: I1216 02:08:37.628512 2448 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 02:08:37.631134 kubelet[2448]: I1216 02:08:37.631092 2448 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 16 02:08:37.631352 kubelet[2448]: I1216 02:08:37.631328 2448 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 02:08:37.631500 kubelet[2448]: I1216 02:08:37.631354 2448 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-9-be0981937a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 02:08:37.631593 kubelet[2448]: I1216 02:08:37.631502 2448 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 02:08:37.631593 kubelet[2448]: I1216 02:08:37.631513 2448 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 02:08:37.631633 kubelet[2448]: I1216 02:08:37.631625 2448 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 02:08:37.635522 kubelet[2448]: I1216 02:08:37.635494 2448 state_mem.go:36] "Initialized new in-memory state store" Dec 16 02:08:37.637295 kubelet[2448]: I1216 02:08:37.637268 2448 kubelet.go:475] "Attempting to sync node with API server" Dec 16 02:08:37.637357 kubelet[2448]: I1216 02:08:37.637301 2448 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 02:08:37.638053 kubelet[2448]: E1216 02:08:37.638001 2448 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://49.13.61.135:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-9-be0981937a&limit=500&resourceVersion=0\": dial tcp 49.13.61.135:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 16 02:08:37.639052 kubelet[2448]: I1216 02:08:37.638127 2448 kubelet.go:387] "Adding apiserver pod source" Dec 16 02:08:37.639052 kubelet[2448]: I1216 02:08:37.638176 2448 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 02:08:37.639789 kubelet[2448]: E1216 02:08:37.639758 2448 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get 
\"https://49.13.61.135:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 49.13.61.135:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 02:08:37.640339 kubelet[2448]: I1216 02:08:37.640313 2448 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 02:08:37.641558 kubelet[2448]: I1216 02:08:37.641529 2448 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 02:08:37.641686 kubelet[2448]: I1216 02:08:37.641673 2448 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 02:08:37.641797 kubelet[2448]: W1216 02:08:37.641785 2448 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 16 02:08:37.645004 kubelet[2448]: I1216 02:08:37.644987 2448 server.go:1262] "Started kubelet" Dec 16 02:08:37.646608 kubelet[2448]: I1216 02:08:37.646580 2448 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 02:08:37.650264 kubelet[2448]: I1216 02:08:37.650209 2448 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 02:08:37.652476 kubelet[2448]: I1216 02:08:37.652452 2448 server.go:310] "Adding debug handlers to kubelet server" Dec 16 02:08:37.654668 kubelet[2448]: I1216 02:08:37.654610 2448 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 02:08:37.654742 kubelet[2448]: I1216 02:08:37.654686 2448 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 02:08:37.655052 kubelet[2448]: I1216 02:08:37.654998 2448 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 02:08:37.656313 kubelet[2448]: I1216 02:08:37.656283 2448 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 02:08:37.658234 kubelet[2448]: E1216 02:08:37.651150 2448 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://49.13.61.135:6443/api/v1/namespaces/default/events\": dial tcp 49.13.61.135:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-9-be0981937a.1881900dc08636b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-9-be0981937a,UID:ci-4547-0-0-9-be0981937a,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-9-be0981937a,},FirstTimestamp:2025-12-16 02:08:37.64494098 +0000 UTC m=+0.935962651,LastTimestamp:2025-12-16 02:08:37.64494098 +0000 UTC m=+0.935962651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-9-be0981937a,}" Dec 16 02:08:37.658393 kubelet[2448]: I1216 02:08:37.658378 2448 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 02:08:37.659522 kubelet[2448]: I1216 02:08:37.659448 2448 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 02:08:37.659522 kubelet[2448]: E1216 02:08:37.658696 2448 
kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-9-be0981937a\" not found" Dec 16 02:08:37.659522 kubelet[2448]: I1216 02:08:37.659514 2448 reconciler.go:29] "Reconciler: start to sync state" Dec 16 02:08:37.661189 kubelet[2448]: I1216 02:08:37.660876 2448 factory.go:223] Registration of the systemd container factory successfully Dec 16 02:08:37.661189 kubelet[2448]: I1216 02:08:37.660975 2448 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 02:08:37.661476 kubelet[2448]: E1216 02:08:37.661438 2448 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.61.135:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-9-be0981937a?timeout=10s\": dial tcp 49.13.61.135:6443: connect: connection refused" interval="200ms" Dec 16 02:08:37.662733 kubelet[2448]: I1216 02:08:37.662707 2448 factory.go:223] Registration of the containerd container factory successfully Dec 16 02:08:37.664000 audit[2464]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2464 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:37.664000 audit[2464]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=fffffc8162e0 a2=0 a3=0 items=0 ppid=2448 pid=2464 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.664000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 02:08:37.666000 audit[2465]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2465 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:37.666000 audit[2465]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc5bfe5b0 a2=0 a3=0 items=0 ppid=2448 pid=2465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.666000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 02:08:37.669000 audit[2467]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2467 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:37.669000 audit[2467]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffeefa5370 a2=0 a3=0 items=0 ppid=2448 pid=2467 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.669000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 02:08:37.672000 audit[2469]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2469 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:37.672000 audit[2469]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffff88907c0 a2=0 a3=0 items=0 ppid=2448 pid=2469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.672000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 02:08:37.683000 audit[2472]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2472 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:37.683000 audit[2472]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=fffff6ce5af0 a2=0 a3=0 items=0 ppid=2448 pid=2472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.683000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F380000002D2D737263003132372E Dec 16 02:08:37.687247 kubelet[2448]: E1216 02:08:37.686578 2448 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://49.13.61.135:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 49.13.61.135:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 02:08:37.689894 kubelet[2448]: I1216 02:08:37.689780 2448 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 16 02:08:37.689000 audit[2476]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2476 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:37.689000 audit[2476]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffe2fc0270 a2=0 a3=0 items=0 ppid=2448 pid=2476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.689000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 02:08:37.691360 kubelet[2448]: E1216 02:08:37.690791 2448 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 02:08:37.691360 kubelet[2448]: I1216 02:08:37.691139 2448 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Dec 16 02:08:37.691360 kubelet[2448]: I1216 02:08:37.691179 2448 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 02:08:37.691360 kubelet[2448]: I1216 02:08:37.691211 2448 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 02:08:37.691360 kubelet[2448]: E1216 02:08:37.691256 2448 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 02:08:37.693744 kubelet[2448]: E1216 02:08:37.693692 2448 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://49.13.61.135:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 49.13.61.135:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 02:08:37.693000 audit[2478]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2478 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:37.693000 audit[2478]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd4a43530 a2=0 a3=0 items=0 ppid=2448 pid=2478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.693000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 02:08:37.695000 audit[2477]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2477 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:37.695000 audit[2477]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffce2b8320 a2=0 a3=0 items=0 ppid=2448 pid=2477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.695000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 02:08:37.697818 kubelet[2448]: I1216 02:08:37.697743 2448 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 02:08:37.696000 audit[2481]: NETFILTER_CFG table=nat:50 family=2 entries=1 op=nft_register_chain pid=2481 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:37.696000 audit[2481]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffda505650 a2=0 a3=0 items=0 ppid=2448 pid=2481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.696000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 02:08:37.698140 kubelet[2448]: I1216 02:08:37.698072 2448 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 02:08:37.698140 kubelet[2448]: I1216 02:08:37.698097 2448 state_mem.go:36] "Initialized new in-memory state store" Dec 16 02:08:37.698000 audit[2482]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_chain pid=2482 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:37.698000 audit[2482]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe6214330 a2=0 a3=0 items=0 ppid=2448 pid=2482 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.698000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 02:08:37.700842 kubelet[2448]: I1216 02:08:37.700822 2448 policy_none.go:49] "None policy: Start" Dec 16 02:08:37.701637 kubelet[2448]: I1216 02:08:37.701601 2448 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 02:08:37.701704 kubelet[2448]: I1216 02:08:37.701647 2448 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 02:08:37.701000 audit[2483]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=2483 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:37.703250 kubelet[2448]: I1216 02:08:37.703137 2448 policy_none.go:47] "Start" Dec 16 02:08:37.701000 audit[2483]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffdd141010 a2=0 a3=0 items=0 ppid=2448 pid=2483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.701000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 02:08:37.702000 audit[2484]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2484 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:37.702000 audit[2484]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd1567e70 a2=0 a3=0 items=0 ppid=2448 pid=2484 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:37.702000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 02:08:37.709402 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 16 02:08:37.725994 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 16 02:08:37.732724 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 16 02:08:37.742478 kubelet[2448]: E1216 02:08:37.741664 2448 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 02:08:37.742478 kubelet[2448]: I1216 02:08:37.742075 2448 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 02:08:37.742478 kubelet[2448]: I1216 02:08:37.742093 2448 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 02:08:37.742478 kubelet[2448]: I1216 02:08:37.742387 2448 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 02:08:37.743688 kubelet[2448]: E1216 02:08:37.743661 2448 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 16 02:08:37.743881 kubelet[2448]: E1216 02:08:37.743862 2448 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-0-0-9-be0981937a\" not found" Dec 16 02:08:37.806543 systemd[1]: Created slice kubepods-burstable-podfc13fbcaf0d55aea76ba67ff2904fc40.slice - libcontainer container kubepods-burstable-podfc13fbcaf0d55aea76ba67ff2904fc40.slice. Dec 16 02:08:37.834686 kubelet[2448]: E1216 02:08:37.834641 2448 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.840056 systemd[1]: Created slice kubepods-burstable-pod0d0bca379f544505ddc4ae259ee17262.slice - libcontainer container kubepods-burstable-pod0d0bca379f544505ddc4ae259ee17262.slice. Dec 16 02:08:37.844941 kubelet[2448]: E1216 02:08:37.844597 2448 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.845449 kubelet[2448]: I1216 02:08:37.844734 2448 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.845803 kubelet[2448]: E1216 02:08:37.845765 2448 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://49.13.61.135:6443/api/v1/nodes\": dial tcp 49.13.61.135:6443: connect: connection refused" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.850006 systemd[1]: Created slice kubepods-burstable-pod883466bef02aa67af54c1ed2b3d23367.slice - libcontainer container kubepods-burstable-pod883466bef02aa67af54c1ed2b3d23367.slice. Dec 16 02:08:37.852347 kubelet[2448]: E1216 02:08:37.852305 2448 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.862590 kubelet[2448]: E1216 02:08:37.862401 2448 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.61.135:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-9-be0981937a?timeout=10s\": dial tcp 49.13.61.135:6443: connect: connection refused" interval="400ms" Dec 16 02:08:37.961514 kubelet[2448]: I1216 02:08:37.961294 2448 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fc13fbcaf0d55aea76ba67ff2904fc40-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-9-be0981937a\" (UID: \"fc13fbcaf0d55aea76ba67ff2904fc40\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.961514 kubelet[2448]: I1216 02:08:37.961418 2448 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fc13fbcaf0d55aea76ba67ff2904fc40-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-9-be0981937a\" (UID: \"fc13fbcaf0d55aea76ba67ff2904fc40\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.961514 kubelet[2448]: I1216 02:08:37.961508 2448 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fc13fbcaf0d55aea76ba67ff2904fc40-usr-share-ca-certificates\") pod 
\"kube-controller-manager-ci-4547-0-0-9-be0981937a\" (UID: \"fc13fbcaf0d55aea76ba67ff2904fc40\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.961908 kubelet[2448]: I1216 02:08:37.961556 2448 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0d0bca379f544505ddc4ae259ee17262-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-9-be0981937a\" (UID: \"0d0bca379f544505ddc4ae259ee17262\") " pod="kube-system/kube-scheduler-ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.961908 kubelet[2448]: I1216 02:08:37.961585 2448 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/883466bef02aa67af54c1ed2b3d23367-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-9-be0981937a\" (UID: \"883466bef02aa67af54c1ed2b3d23367\") " pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.961908 kubelet[2448]: I1216 02:08:37.961612 2448 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fc13fbcaf0d55aea76ba67ff2904fc40-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-9-be0981937a\" (UID: \"fc13fbcaf0d55aea76ba67ff2904fc40\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.961908 kubelet[2448]: I1216 02:08:37.961651 2448 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fc13fbcaf0d55aea76ba67ff2904fc40-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-9-be0981937a\" (UID: \"fc13fbcaf0d55aea76ba67ff2904fc40\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.961908 kubelet[2448]: I1216 02:08:37.961671 2448 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/883466bef02aa67af54c1ed2b3d23367-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-9-be0981937a\" (UID: \"883466bef02aa67af54c1ed2b3d23367\") " pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" Dec 16 02:08:37.962061 kubelet[2448]: I1216 02:08:37.961717 2448 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/883466bef02aa67af54c1ed2b3d23367-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-9-be0981937a\" (UID: \"883466bef02aa67af54c1ed2b3d23367\") " pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" Dec 16 02:08:38.048919 kubelet[2448]: I1216 02:08:38.048823 2448 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:38.049630 kubelet[2448]: E1216 02:08:38.049553 2448 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://49.13.61.135:6443/api/v1/nodes\": dial tcp 49.13.61.135:6443: connect: connection refused" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:38.140570 containerd[1587]: time="2025-12-16T02:08:38.140415041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-9-be0981937a,Uid:fc13fbcaf0d55aea76ba67ff2904fc40,Namespace:kube-system,Attempt:0,}" Dec 16 02:08:38.148953 containerd[1587]: time="2025-12-16T02:08:38.148861296Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-9-be0981937a,Uid:0d0bca379f544505ddc4ae259ee17262,Namespace:kube-system,Attempt:0,}" Dec 16 02:08:38.156200 containerd[1587]: time="2025-12-16T02:08:38.156074476Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-9-be0981937a,Uid:883466bef02aa67af54c1ed2b3d23367,Namespace:kube-system,Attempt:0,}" Dec 16 02:08:38.262963 kubelet[2448]: E1216 02:08:38.262888 2448 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.61.135:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-9-be0981937a?timeout=10s\": dial tcp 49.13.61.135:6443: connect: connection refused" interval="800ms" Dec 16 02:08:38.292434 kubelet[2448]: E1216 02:08:38.292222 2448 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://49.13.61.135:6443/api/v1/namespaces/default/events\": dial tcp 49.13.61.135:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-9-be0981937a.1881900dc08636b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-9-be0981937a,UID:ci-4547-0-0-9-be0981937a,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-9-be0981937a,},FirstTimestamp:2025-12-16 02:08:37.64494098 +0000 UTC m=+0.935962651,LastTimestamp:2025-12-16 02:08:37.64494098 +0000 UTC m=+0.935962651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-9-be0981937a,}" Dec 16 02:08:38.453001 kubelet[2448]: I1216 02:08:38.452930 2448 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:38.453607 kubelet[2448]: E1216 02:08:38.453575 2448 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://49.13.61.135:6443/api/v1/nodes\": dial tcp 49.13.61.135:6443: connect: connection refused" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:38.609939 kubelet[2448]: E1216 02:08:38.609864 2448 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://49.13.61.135:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-9-be0981937a&limit=500&resourceVersion=0\": dial tcp 49.13.61.135:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 16 02:08:38.628633 kubelet[2448]: E1216 02:08:38.628584 2448 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://49.13.61.135:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 49.13.61.135:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 02:08:38.672508 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2029280225.mount: Deactivated successfully. 
Dec 16 02:08:38.680910 containerd[1587]: time="2025-12-16T02:08:38.680202176Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 02:08:38.683181 containerd[1587]: time="2025-12-16T02:08:38.683121415Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 02:08:38.688472 containerd[1587]: time="2025-12-16T02:08:38.688410187Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 02:08:38.690540 containerd[1587]: time="2025-12-16T02:08:38.690274503Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 02:08:38.697411 kubelet[2448]: E1216 02:08:38.696349 2448 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://49.13.61.135:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 49.13.61.135:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 02:08:38.697550 containerd[1587]: time="2025-12-16T02:08:38.697072364Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 02:08:38.699634 containerd[1587]: time="2025-12-16T02:08:38.699588445Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 02:08:38.702113 containerd[1587]: time="2025-12-16T02:08:38.700979951Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 02:08:38.702238 containerd[1587]: time="2025-12-16T02:08:38.701492569Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 02:08:38.702354 containerd[1587]: time="2025-12-16T02:08:38.702335890Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 557.866715ms" Dec 16 02:08:38.708887 containerd[1587]: time="2025-12-16T02:08:38.708763800Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 547.38927ms" Dec 16 02:08:38.711935 containerd[1587]: time="2025-12-16T02:08:38.711884637Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest 
\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 561.03208ms" Dec 16 02:08:38.731372 containerd[1587]: time="2025-12-16T02:08:38.731248741Z" level=info msg="connecting to shim 8c83ba9049ba260e396bf8e549021049793142013d799e542077147dc6e8f64d" address="unix:///run/containerd/s/cddd11251b03c2bd2e00c0064853b37dba3f2d59e2930e4aa6214688538ce9ce" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:08:38.756312 containerd[1587]: time="2025-12-16T02:08:38.756252044Z" level=info msg="connecting to shim 04a2c7ccaab3a550be339fd6a88deab003434d18e95a41f1a871b330e83c2bd8" address="unix:///run/containerd/s/900110dbf8debdf2ec68451fe5796047409ab1819d0ec008fbe8cf8f04f476e7" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:08:38.771320 systemd[1]: Started cri-containerd-8c83ba9049ba260e396bf8e549021049793142013d799e542077147dc6e8f64d.scope - libcontainer container 8c83ba9049ba260e396bf8e549021049793142013d799e542077147dc6e8f64d. Dec 16 02:08:38.775394 containerd[1587]: time="2025-12-16T02:08:38.774438763Z" level=info msg="connecting to shim e757afd2abf42cedaeb74b1fa1e26e335eea6b95440422972fad649d0717bfad" address="unix:///run/containerd/s/badc094ce3b44938cc21b53cda76c6c889394f1a7a2880b313859a54a4649554" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:08:38.787271 systemd[1]: Started cri-containerd-04a2c7ccaab3a550be339fd6a88deab003434d18e95a41f1a871b330e83c2bd8.scope - libcontainer container 04a2c7ccaab3a550be339fd6a88deab003434d18e95a41f1a871b330e83c2bd8. Dec 16 02:08:38.797000 audit: BPF prog-id=83 op=LOAD Dec 16 02:08:38.798000 audit: BPF prog-id=84 op=LOAD Dec 16 02:08:38.798000 audit[2508]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2496 pid=2508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.798000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863383362613930343962613236306533393662663865353439303231 Dec 16 02:08:38.798000 audit: BPF prog-id=84 op=UNLOAD Dec 16 02:08:38.798000 audit[2508]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2496 pid=2508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.798000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863383362613930343962613236306533393662663865353439303231 Dec 16 02:08:38.800000 audit: BPF prog-id=85 op=LOAD Dec 16 02:08:38.800000 audit[2508]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2496 pid=2508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.800000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863383362613930343962613236306533393662663865353439303231 Dec 16 02:08:38.800000 audit: BPF prog-id=86 op=LOAD Dec 16 02:08:38.800000 audit[2508]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2496 pid=2508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.800000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863383362613930343962613236306533393662663865353439303231 Dec 16 02:08:38.803000 audit: BPF prog-id=86 op=UNLOAD Dec 16 02:08:38.803000 audit[2508]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2496 pid=2508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.803000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863383362613930343962613236306533393662663865353439303231 Dec 16 02:08:38.803000 audit: BPF prog-id=85 op=UNLOAD Dec 16 02:08:38.803000 audit[2508]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2496 pid=2508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.803000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863383362613930343962613236306533393662663865353439303231 Dec 16 02:08:38.803000 audit: BPF prog-id=87 op=LOAD Dec 16 02:08:38.803000 audit[2508]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2496 pid=2508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.803000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863383362613930343962613236306533393662663865353439303231 Dec 16 02:08:38.816377 systemd[1]: Started cri-containerd-e757afd2abf42cedaeb74b1fa1e26e335eea6b95440422972fad649d0717bfad.scope - libcontainer container e757afd2abf42cedaeb74b1fa1e26e335eea6b95440422972fad649d0717bfad. 
Dec 16 02:08:38.822000 audit: BPF prog-id=88 op=LOAD Dec 16 02:08:38.822000 audit: BPF prog-id=89 op=LOAD Dec 16 02:08:38.822000 audit[2549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2529 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.822000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034613263376363616162336135353062653333396664366138386465 Dec 16 02:08:38.822000 audit: BPF prog-id=89 op=UNLOAD Dec 16 02:08:38.822000 audit[2549]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.822000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034613263376363616162336135353062653333396664366138386465 Dec 16 02:08:38.823000 audit: BPF prog-id=90 op=LOAD Dec 16 02:08:38.823000 audit[2549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2529 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.823000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034613263376363616162336135353062653333396664366138386465 Dec 16 02:08:38.823000 audit: BPF prog-id=91 op=LOAD Dec 16 02:08:38.823000 audit[2549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2529 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.823000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034613263376363616162336135353062653333396664366138386465 Dec 16 02:08:38.824000 audit: BPF prog-id=91 op=UNLOAD Dec 16 02:08:38.824000 audit[2549]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.824000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034613263376363616162336135353062653333396664366138386465 Dec 16 02:08:38.824000 audit: BPF prog-id=90 op=UNLOAD Dec 16 02:08:38.824000 audit[2549]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.824000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034613263376363616162336135353062653333396664366138386465 Dec 16 02:08:38.824000 audit: BPF prog-id=92 op=LOAD Dec 16 02:08:38.824000 audit[2549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2529 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.824000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034613263376363616162336135353062653333396664366138386465 Dec 16 02:08:38.838000 audit: BPF prog-id=93 op=LOAD Dec 16 02:08:38.838000 audit: BPF prog-id=94 op=LOAD Dec 16 02:08:38.838000 audit[2581]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2551 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.838000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353761666432616266343263656461656237346231666131653236 Dec 16 02:08:38.839000 audit: BPF prog-id=94 op=UNLOAD Dec 16 02:08:38.839000 audit[2581]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2551 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.839000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353761666432616266343263656461656237346231666131653236 Dec 16 02:08:38.839000 audit: BPF prog-id=95 op=LOAD Dec 16 02:08:38.839000 audit[2581]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2551 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.839000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353761666432616266343263656461656237346231666131653236 Dec 16 02:08:38.839000 audit: BPF prog-id=96 op=LOAD Dec 16 02:08:38.839000 audit[2581]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2551 pid=2581 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.839000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353761666432616266343263656461656237346231666131653236 Dec 16 02:08:38.839000 audit: BPF prog-id=96 op=UNLOAD Dec 16 02:08:38.839000 audit[2581]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2551 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.839000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353761666432616266343263656461656237346231666131653236 Dec 16 02:08:38.839000 audit: BPF prog-id=95 op=UNLOAD Dec 16 02:08:38.839000 audit[2581]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2551 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.839000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353761666432616266343263656461656237346231666131653236 Dec 16 02:08:38.839000 audit: BPF prog-id=97 op=LOAD Dec 16 02:08:38.839000 audit[2581]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2551 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.839000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537353761666432616266343263656461656237346231666131653236 Dec 16 02:08:38.871251 containerd[1587]: time="2025-12-16T02:08:38.870818119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-9-be0981937a,Uid:fc13fbcaf0d55aea76ba67ff2904fc40,Namespace:kube-system,Attempt:0,} returns sandbox id \"8c83ba9049ba260e396bf8e549021049793142013d799e542077147dc6e8f64d\"" Dec 16 02:08:38.875659 containerd[1587]: time="2025-12-16T02:08:38.875611676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-9-be0981937a,Uid:883466bef02aa67af54c1ed2b3d23367,Namespace:kube-system,Attempt:0,} returns sandbox id \"04a2c7ccaab3a550be339fd6a88deab003434d18e95a41f1a871b330e83c2bd8\"" Dec 16 02:08:38.882472 containerd[1587]: time="2025-12-16T02:08:38.882425700Z" level=info msg="CreateContainer within sandbox \"8c83ba9049ba260e396bf8e549021049793142013d799e542077147dc6e8f64d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 16 02:08:38.885611 containerd[1587]: time="2025-12-16T02:08:38.885458840Z" 
level=info msg="CreateContainer within sandbox \"04a2c7ccaab3a550be339fd6a88deab003434d18e95a41f1a871b330e83c2bd8\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 16 02:08:38.885998 containerd[1587]: time="2025-12-16T02:08:38.885743134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-9-be0981937a,Uid:0d0bca379f544505ddc4ae259ee17262,Namespace:kube-system,Attempt:0,} returns sandbox id \"e757afd2abf42cedaeb74b1fa1e26e335eea6b95440422972fad649d0717bfad\"" Dec 16 02:08:38.891128 containerd[1587]: time="2025-12-16T02:08:38.891086517Z" level=info msg="CreateContainer within sandbox \"e757afd2abf42cedaeb74b1fa1e26e335eea6b95440422972fad649d0717bfad\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 16 02:08:38.894935 containerd[1587]: time="2025-12-16T02:08:38.894894445Z" level=info msg="Container ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:08:38.896615 containerd[1587]: time="2025-12-16T02:08:38.896566725Z" level=info msg="Container 99c513cd059cb891e747a3c38ace6984ea2d229694d258a7fc8254dec9c585a0: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:08:38.905625 containerd[1587]: time="2025-12-16T02:08:38.905453265Z" level=info msg="CreateContainer within sandbox \"04a2c7ccaab3a550be339fd6a88deab003434d18e95a41f1a871b330e83c2bd8\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"99c513cd059cb891e747a3c38ace6984ea2d229694d258a7fc8254dec9c585a0\"" Dec 16 02:08:38.906510 containerd[1587]: time="2025-12-16T02:08:38.906462858Z" level=info msg="StartContainer for \"99c513cd059cb891e747a3c38ace6984ea2d229694d258a7fc8254dec9c585a0\"" Dec 16 02:08:38.910718 containerd[1587]: time="2025-12-16T02:08:38.910658140Z" level=info msg="connecting to shim 99c513cd059cb891e747a3c38ace6984ea2d229694d258a7fc8254dec9c585a0" address="unix:///run/containerd/s/900110dbf8debdf2ec68451fe5796047409ab1819d0ec008fbe8cf8f04f476e7" protocol=ttrpc version=3 Dec 16 02:08:38.911283 containerd[1587]: time="2025-12-16T02:08:38.911253214Z" level=info msg="Container 0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:08:38.913797 containerd[1587]: time="2025-12-16T02:08:38.913747691Z" level=info msg="CreateContainer within sandbox \"8c83ba9049ba260e396bf8e549021049793142013d799e542077147dc6e8f64d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361\"" Dec 16 02:08:38.916566 containerd[1587]: time="2025-12-16T02:08:38.916485215Z" level=info msg="StartContainer for \"ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361\"" Dec 16 02:08:38.920372 containerd[1587]: time="2025-12-16T02:08:38.920245214Z" level=info msg="connecting to shim ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361" address="unix:///run/containerd/s/cddd11251b03c2bd2e00c0064853b37dba3f2d59e2930e4aa6214688538ce9ce" protocol=ttrpc version=3 Dec 16 02:08:38.931244 containerd[1587]: time="2025-12-16T02:08:38.931201350Z" level=info msg="CreateContainer within sandbox \"e757afd2abf42cedaeb74b1fa1e26e335eea6b95440422972fad649d0717bfad\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96\"" Dec 16 02:08:38.934948 containerd[1587]: time="2025-12-16T02:08:38.934887015Z" level=info msg="StartContainer for 
\"0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96\"" Dec 16 02:08:38.936577 kubelet[2448]: E1216 02:08:38.936504 2448 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://49.13.61.135:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 49.13.61.135:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 02:08:38.942800 containerd[1587]: time="2025-12-16T02:08:38.942665983Z" level=info msg="connecting to shim 0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96" address="unix:///run/containerd/s/badc094ce3b44938cc21b53cda76c6c889394f1a7a2880b313859a54a4649554" protocol=ttrpc version=3 Dec 16 02:08:38.947350 systemd[1]: Started cri-containerd-99c513cd059cb891e747a3c38ace6984ea2d229694d258a7fc8254dec9c585a0.scope - libcontainer container 99c513cd059cb891e747a3c38ace6984ea2d229694d258a7fc8254dec9c585a0. Dec 16 02:08:38.954322 systemd[1]: Started cri-containerd-ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361.scope - libcontainer container ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361. Dec 16 02:08:38.972311 systemd[1]: Started cri-containerd-0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96.scope - libcontainer container 0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96. Dec 16 02:08:38.984000 audit: BPF prog-id=98 op=LOAD Dec 16 02:08:38.986000 audit: BPF prog-id=99 op=LOAD Dec 16 02:08:38.986000 audit[2632]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2529 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939633531336364303539636238393165373437613363333861636536 Dec 16 02:08:38.986000 audit: BPF prog-id=99 op=UNLOAD Dec 16 02:08:38.986000 audit[2632]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939633531336364303539636238393165373437613363333861636536 Dec 16 02:08:38.986000 audit: BPF prog-id=100 op=LOAD Dec 16 02:08:38.986000 audit[2632]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2529 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939633531336364303539636238393165373437613363333861636536 Dec 16 02:08:38.986000 audit: 
BPF prog-id=101 op=LOAD Dec 16 02:08:38.986000 audit[2632]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2529 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939633531336364303539636238393165373437613363333861636536 Dec 16 02:08:38.986000 audit: BPF prog-id=101 op=UNLOAD Dec 16 02:08:38.986000 audit[2632]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939633531336364303539636238393165373437613363333861636536 Dec 16 02:08:38.986000 audit: BPF prog-id=100 op=UNLOAD Dec 16 02:08:38.986000 audit[2632]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2529 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939633531336364303539636238393165373437613363333861636536 Dec 16 02:08:38.986000 audit: BPF prog-id=102 op=LOAD Dec 16 02:08:38.986000 audit: BPF prog-id=103 op=LOAD Dec 16 02:08:38.986000 audit[2632]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2529 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939633531336364303539636238393165373437613363333861636536 Dec 16 02:08:38.987000 audit: BPF prog-id=104 op=LOAD Dec 16 02:08:38.987000 audit[2633]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=2496 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.987000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162336136633166646338346463326138393562326531316334303131 Dec 16 02:08:38.988000 audit: BPF prog-id=104 op=UNLOAD Dec 16 02:08:38.988000 audit[2633]: SYSCALL arch=c00000b7 syscall=57 
success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2496 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162336136633166646338346463326138393562326531316334303131 Dec 16 02:08:38.988000 audit: BPF prog-id=105 op=LOAD Dec 16 02:08:38.988000 audit[2633]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=2496 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162336136633166646338346463326138393562326531316334303131 Dec 16 02:08:38.988000 audit: BPF prog-id=106 op=LOAD Dec 16 02:08:38.988000 audit[2633]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=2496 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162336136633166646338346463326138393562326531316334303131 Dec 16 02:08:38.988000 audit: BPF prog-id=106 op=UNLOAD Dec 16 02:08:38.988000 audit[2633]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2496 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162336136633166646338346463326138393562326531316334303131 Dec 16 02:08:38.988000 audit: BPF prog-id=105 op=UNLOAD Dec 16 02:08:38.988000 audit[2633]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2496 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162336136633166646338346463326138393562326531316334303131 Dec 16 02:08:38.988000 audit: BPF prog-id=107 op=LOAD Dec 16 02:08:38.988000 audit[2633]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=2496 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:38.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162336136633166646338346463326138393562326531316334303131 Dec 16 02:08:39.004000 audit: BPF prog-id=108 op=LOAD Dec 16 02:08:39.006000 audit: BPF prog-id=109 op=LOAD Dec 16 02:08:39.006000 audit[2654]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2551 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:39.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066633165306534303839306633623561646139626336343939653837 Dec 16 02:08:39.006000 audit: BPF prog-id=109 op=UNLOAD Dec 16 02:08:39.006000 audit[2654]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2551 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:39.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066633165306534303839306633623561646139626336343939653837 Dec 16 02:08:39.006000 audit: BPF prog-id=110 op=LOAD Dec 16 02:08:39.006000 audit[2654]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2551 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:39.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066633165306534303839306633623561646139626336343939653837 Dec 16 02:08:39.006000 audit: BPF prog-id=111 op=LOAD Dec 16 02:08:39.006000 audit[2654]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2551 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:39.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066633165306534303839306633623561646139626336343939653837 Dec 16 02:08:39.006000 audit: BPF prog-id=111 op=UNLOAD Dec 16 02:08:39.006000 audit[2654]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2551 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 02:08:39.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066633165306534303839306633623561646139626336343939653837 Dec 16 02:08:39.006000 audit: BPF prog-id=110 op=UNLOAD Dec 16 02:08:39.006000 audit[2654]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2551 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:39.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066633165306534303839306633623561646139626336343939653837 Dec 16 02:08:39.006000 audit: BPF prog-id=112 op=LOAD Dec 16 02:08:39.006000 audit[2654]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2551 pid=2654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:39.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066633165306534303839306633623561646139626336343939653837 Dec 16 02:08:39.052406 containerd[1587]: time="2025-12-16T02:08:39.052354257Z" level=info msg="StartContainer for \"ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361\" returns successfully" Dec 16 02:08:39.064456 kubelet[2448]: E1216 02:08:39.063461 2448 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.61.135:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-9-be0981937a?timeout=10s\": dial tcp 49.13.61.135:6443: connect: connection refused" interval="1.6s" Dec 16 02:08:39.066108 containerd[1587]: time="2025-12-16T02:08:39.064757234Z" level=info msg="StartContainer for \"99c513cd059cb891e747a3c38ace6984ea2d229694d258a7fc8254dec9c585a0\" returns successfully" Dec 16 02:08:39.066924 containerd[1587]: time="2025-12-16T02:08:39.066838540Z" level=info msg="StartContainer for \"0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96\" returns successfully" Dec 16 02:08:39.256571 kubelet[2448]: I1216 02:08:39.256062 2448 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:39.709685 kubelet[2448]: E1216 02:08:39.709647 2448 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:39.716054 kubelet[2448]: E1216 02:08:39.715504 2448 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:39.719898 kubelet[2448]: E1216 02:08:39.719870 2448 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:40.720846 kubelet[2448]: E1216 
02:08:40.720567 2448 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:40.720846 kubelet[2448]: E1216 02:08:40.720584 2448 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:41.682409 kubelet[2448]: E1216 02:08:41.682299 2448 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:41.724965 kubelet[2448]: E1216 02:08:41.724926 2448 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:41.725317 kubelet[2448]: E1216 02:08:41.725258 2448 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:42.271491 kubelet[2448]: E1216 02:08:42.271447 2448 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-0-0-9-be0981937a\" not found" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:42.354591 kubelet[2448]: I1216 02:08:42.354529 2448 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:42.360012 kubelet[2448]: I1216 02:08:42.359897 2448 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:42.418045 kubelet[2448]: E1216 02:08:42.416815 2448 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-9-be0981937a\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:42.418045 kubelet[2448]: I1216 02:08:42.416853 2448 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-9-be0981937a" Dec 16 02:08:42.425326 kubelet[2448]: E1216 02:08:42.425286 2448 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-9-be0981937a\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-0-0-9-be0981937a" Dec 16 02:08:42.425778 kubelet[2448]: I1216 02:08:42.425475 2448 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" Dec 16 02:08:42.434588 kubelet[2448]: E1216 02:08:42.434554 2448 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-9-be0981937a\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" Dec 16 02:08:42.642869 kubelet[2448]: I1216 02:08:42.642718 2448 apiserver.go:52] "Watching apiserver" Dec 16 02:08:42.660108 kubelet[2448]: I1216 02:08:42.660064 2448 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 02:08:44.669886 systemd[1]: Reload requested from client PID 2734 ('systemctl') (unit session-8.scope)... Dec 16 02:08:44.669911 systemd[1]: Reloading... Dec 16 02:08:44.794088 zram_generator::config[2784]: No configuration found. 
Dec 16 02:08:45.013919 systemd[1]: Reloading finished in 343 ms. Dec 16 02:08:45.049896 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 02:08:45.050255 kubelet[2448]: I1216 02:08:45.049895 2448 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 02:08:45.064734 systemd[1]: kubelet.service: Deactivated successfully. Dec 16 02:08:45.065218 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 02:08:45.064000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:45.065325 systemd[1]: kubelet.service: Consumed 1.416s CPU time, 121.5M memory peak. Dec 16 02:08:45.068150 kernel: kauditd_printk_skb: 202 callbacks suppressed Dec 16 02:08:45.068218 kernel: audit: type=1131 audit(1765850925.064:388): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:45.068000 audit: BPF prog-id=113 op=LOAD Dec 16 02:08:45.068000 audit: BPF prog-id=66 op=UNLOAD Dec 16 02:08:45.069000 audit: BPF prog-id=114 op=LOAD Dec 16 02:08:45.068747 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 02:08:45.071530 kernel: audit: type=1334 audit(1765850925.068:389): prog-id=113 op=LOAD Dec 16 02:08:45.071559 kernel: audit: type=1334 audit(1765850925.068:390): prog-id=66 op=UNLOAD Dec 16 02:08:45.071588 kernel: audit: type=1334 audit(1765850925.069:391): prog-id=114 op=LOAD Dec 16 02:08:45.069000 audit: BPF prog-id=115 op=LOAD Dec 16 02:08:45.070000 audit: BPF prog-id=67 op=UNLOAD Dec 16 02:08:45.074448 kernel: audit: type=1334 audit(1765850925.069:392): prog-id=115 op=LOAD Dec 16 02:08:45.074495 kernel: audit: type=1334 audit(1765850925.070:393): prog-id=67 op=UNLOAD Dec 16 02:08:45.070000 audit: BPF prog-id=68 op=UNLOAD Dec 16 02:08:45.071000 audit: BPF prog-id=116 op=LOAD Dec 16 02:08:45.076586 kernel: audit: type=1334 audit(1765850925.070:394): prog-id=68 op=UNLOAD Dec 16 02:08:45.076642 kernel: audit: type=1334 audit(1765850925.071:395): prog-id=116 op=LOAD Dec 16 02:08:45.076664 kernel: audit: type=1334 audit(1765850925.071:396): prog-id=80 op=UNLOAD Dec 16 02:08:45.071000 audit: BPF prog-id=80 op=UNLOAD Dec 16 02:08:45.071000 audit: BPF prog-id=117 op=LOAD Dec 16 02:08:45.071000 audit: BPF prog-id=118 op=LOAD Dec 16 02:08:45.071000 audit: BPF prog-id=81 op=UNLOAD Dec 16 02:08:45.071000 audit: BPF prog-id=82 op=UNLOAD Dec 16 02:08:45.073000 audit: BPF prog-id=119 op=LOAD Dec 16 02:08:45.073000 audit: BPF prog-id=72 op=UNLOAD Dec 16 02:08:45.073000 audit: BPF prog-id=120 op=LOAD Dec 16 02:08:45.073000 audit: BPF prog-id=121 op=LOAD Dec 16 02:08:45.073000 audit: BPF prog-id=73 op=UNLOAD Dec 16 02:08:45.073000 audit: BPF prog-id=74 op=UNLOAD Dec 16 02:08:45.074000 audit: BPF prog-id=122 op=LOAD Dec 16 02:08:45.074000 audit: BPF prog-id=76 op=UNLOAD Dec 16 02:08:45.078990 kernel: audit: type=1334 audit(1765850925.071:397): prog-id=117 op=LOAD Dec 16 02:08:45.076000 audit: BPF prog-id=123 op=LOAD Dec 16 02:08:45.076000 audit: BPF prog-id=69 op=UNLOAD Dec 16 02:08:45.076000 audit: BPF prog-id=124 op=LOAD Dec 16 02:08:45.076000 audit: BPF prog-id=125 op=LOAD Dec 16 02:08:45.077000 audit: BPF prog-id=70 op=UNLOAD Dec 16 02:08:45.077000 audit: BPF prog-id=71 op=UNLOAD 
Dec 16 02:08:45.077000 audit: BPF prog-id=126 op=LOAD Dec 16 02:08:45.077000 audit: BPF prog-id=63 op=UNLOAD Dec 16 02:08:45.077000 audit: BPF prog-id=127 op=LOAD Dec 16 02:08:45.077000 audit: BPF prog-id=128 op=LOAD Dec 16 02:08:45.077000 audit: BPF prog-id=64 op=UNLOAD Dec 16 02:08:45.077000 audit: BPF prog-id=65 op=UNLOAD Dec 16 02:08:45.082000 audit: BPF prog-id=129 op=LOAD Dec 16 02:08:45.082000 audit: BPF prog-id=75 op=UNLOAD Dec 16 02:08:45.082000 audit: BPF prog-id=130 op=LOAD Dec 16 02:08:45.083000 audit: BPF prog-id=131 op=LOAD Dec 16 02:08:45.083000 audit: BPF prog-id=77 op=UNLOAD Dec 16 02:08:45.083000 audit: BPF prog-id=78 op=UNLOAD Dec 16 02:08:45.084000 audit: BPF prog-id=132 op=LOAD Dec 16 02:08:45.084000 audit: BPF prog-id=79 op=UNLOAD Dec 16 02:08:45.261470 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 02:08:45.261000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:08:45.275503 (kubelet)[2826]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 02:08:45.344460 kubelet[2826]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 02:08:45.345140 kubelet[2826]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 02:08:45.345140 kubelet[2826]: I1216 02:08:45.344881 2826 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 02:08:45.354058 kubelet[2826]: I1216 02:08:45.353148 2826 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 02:08:45.354058 kubelet[2826]: I1216 02:08:45.353184 2826 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 02:08:45.354058 kubelet[2826]: I1216 02:08:45.353216 2826 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 02:08:45.354058 kubelet[2826]: I1216 02:08:45.353223 2826 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 02:08:45.354058 kubelet[2826]: I1216 02:08:45.353471 2826 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 02:08:45.355287 kubelet[2826]: I1216 02:08:45.355264 2826 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 16 02:08:45.357690 kubelet[2826]: I1216 02:08:45.357661 2826 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 02:08:45.366044 kubelet[2826]: I1216 02:08:45.366000 2826 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 02:08:45.368302 kubelet[2826]: I1216 02:08:45.368277 2826 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 16 02:08:45.368483 kubelet[2826]: I1216 02:08:45.368456 2826 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 02:08:45.368633 kubelet[2826]: I1216 02:08:45.368481 2826 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-9-be0981937a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 02:08:45.368725 kubelet[2826]: I1216 02:08:45.368636 2826 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 02:08:45.368725 kubelet[2826]: I1216 02:08:45.368644 2826 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 02:08:45.368725 kubelet[2826]: I1216 02:08:45.368669 2826 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 02:08:45.369679 kubelet[2826]: I1216 02:08:45.369657 2826 state_mem.go:36] "Initialized new in-memory state store" Dec 16 02:08:45.369868 kubelet[2826]: I1216 02:08:45.369817 2826 kubelet.go:475] "Attempting to sync node with API server" Dec 16 02:08:45.369868 kubelet[2826]: I1216 02:08:45.369839 2826 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 02:08:45.369868 kubelet[2826]: I1216 02:08:45.369865 2826 kubelet.go:387] "Adding apiserver pod source" Dec 16 02:08:45.370080 kubelet[2826]: I1216 02:08:45.369878 2826 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 02:08:45.373707 kubelet[2826]: I1216 02:08:45.373660 2826 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 02:08:45.375404 kubelet[2826]: I1216 02:08:45.375333 2826 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 02:08:45.375404 kubelet[2826]: I1216 02:08:45.375373 2826 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 
02:08:45.382290 kubelet[2826]: I1216 02:08:45.382263 2826 server.go:1262] "Started kubelet" Dec 16 02:08:45.385844 kubelet[2826]: I1216 02:08:45.385559 2826 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 02:08:45.389527 kubelet[2826]: I1216 02:08:45.389286 2826 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 02:08:45.390588 kubelet[2826]: I1216 02:08:45.390569 2826 server.go:310] "Adding debug handlers to kubelet server" Dec 16 02:08:45.394523 kubelet[2826]: I1216 02:08:45.394466 2826 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 02:08:45.395061 kubelet[2826]: I1216 02:08:45.394647 2826 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 02:08:45.395061 kubelet[2826]: I1216 02:08:45.394830 2826 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 02:08:45.395333 kubelet[2826]: I1216 02:08:45.395315 2826 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 02:08:45.398330 kubelet[2826]: E1216 02:08:45.398304 2826 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 02:08:45.400711 kubelet[2826]: I1216 02:08:45.400693 2826 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 02:08:45.401522 kubelet[2826]: E1216 02:08:45.401497 2826 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-9-be0981937a\" not found" Dec 16 02:08:45.402085 kubelet[2826]: I1216 02:08:45.402066 2826 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 02:08:45.402253 kubelet[2826]: I1216 02:08:45.402241 2826 reconciler.go:29] "Reconciler: start to sync state" Dec 16 02:08:45.407375 kubelet[2826]: I1216 02:08:45.406588 2826 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 02:08:45.411910 kubelet[2826]: I1216 02:08:45.411876 2826 factory.go:223] Registration of the containerd container factory successfully Dec 16 02:08:45.420241 kubelet[2826]: I1216 02:08:45.420194 2826 factory.go:223] Registration of the systemd container factory successfully Dec 16 02:08:45.444884 kubelet[2826]: I1216 02:08:45.444833 2826 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 16 02:08:45.447215 kubelet[2826]: I1216 02:08:45.447096 2826 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Dec 16 02:08:45.447215 kubelet[2826]: I1216 02:08:45.447206 2826 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 02:08:45.447372 kubelet[2826]: I1216 02:08:45.447232 2826 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 02:08:45.447372 kubelet[2826]: E1216 02:08:45.447297 2826 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 02:08:45.483826 kubelet[2826]: I1216 02:08:45.483583 2826 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 02:08:45.483826 kubelet[2826]: I1216 02:08:45.483609 2826 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 02:08:45.483826 kubelet[2826]: I1216 02:08:45.483641 2826 state_mem.go:36] "Initialized new in-memory state store" Dec 16 02:08:45.483826 kubelet[2826]: I1216 02:08:45.483808 2826 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 16 02:08:45.483826 kubelet[2826]: I1216 02:08:45.483821 2826 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 16 02:08:45.483826 kubelet[2826]: I1216 02:08:45.483839 2826 policy_none.go:49] "None policy: Start" Dec 16 02:08:45.484217 kubelet[2826]: I1216 02:08:45.483849 2826 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 02:08:45.484217 kubelet[2826]: I1216 02:08:45.483859 2826 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 02:08:45.484217 kubelet[2826]: I1216 02:08:45.483971 2826 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Dec 16 02:08:45.484217 kubelet[2826]: I1216 02:08:45.483979 2826 policy_none.go:47] "Start" Dec 16 02:08:45.492380 kubelet[2826]: E1216 02:08:45.492320 2826 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 02:08:45.492825 kubelet[2826]: I1216 02:08:45.492731 2826 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 02:08:45.492825 kubelet[2826]: I1216 02:08:45.492756 2826 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 02:08:45.494159 kubelet[2826]: I1216 02:08:45.493976 2826 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 02:08:45.497658 kubelet[2826]: E1216 02:08:45.497618 2826 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 16 02:08:45.549664 kubelet[2826]: I1216 02:08:45.548534 2826 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.549664 kubelet[2826]: I1216 02:08:45.548764 2826 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.549664 kubelet[2826]: I1216 02:08:45.549246 2826 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.597075 kubelet[2826]: I1216 02:08:45.596911 2826 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.613712 kubelet[2826]: I1216 02:08:45.613552 2826 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.614015 kubelet[2826]: I1216 02:08:45.613882 2826 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.703206 kubelet[2826]: I1216 02:08:45.703003 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fc13fbcaf0d55aea76ba67ff2904fc40-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-9-be0981937a\" (UID: \"fc13fbcaf0d55aea76ba67ff2904fc40\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.703601 kubelet[2826]: I1216 02:08:45.703485 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fc13fbcaf0d55aea76ba67ff2904fc40-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-9-be0981937a\" (UID: \"fc13fbcaf0d55aea76ba67ff2904fc40\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.703601 kubelet[2826]: I1216 02:08:45.703537 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0d0bca379f544505ddc4ae259ee17262-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-9-be0981937a\" (UID: \"0d0bca379f544505ddc4ae259ee17262\") " pod="kube-system/kube-scheduler-ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.703601 kubelet[2826]: I1216 02:08:45.703556 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/883466bef02aa67af54c1ed2b3d23367-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-9-be0981937a\" (UID: \"883466bef02aa67af54c1ed2b3d23367\") " pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.703601 kubelet[2826]: I1216 02:08:45.703572 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fc13fbcaf0d55aea76ba67ff2904fc40-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-9-be0981937a\" (UID: \"fc13fbcaf0d55aea76ba67ff2904fc40\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.704300 kubelet[2826]: I1216 02:08:45.703590 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fc13fbcaf0d55aea76ba67ff2904fc40-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-9-be0981937a\" (UID: 
\"fc13fbcaf0d55aea76ba67ff2904fc40\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.704300 kubelet[2826]: I1216 02:08:45.704164 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fc13fbcaf0d55aea76ba67ff2904fc40-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-9-be0981937a\" (UID: \"fc13fbcaf0d55aea76ba67ff2904fc40\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.704480 kubelet[2826]: I1216 02:08:45.704396 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/883466bef02aa67af54c1ed2b3d23367-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-9-be0981937a\" (UID: \"883466bef02aa67af54c1ed2b3d23367\") " pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" Dec 16 02:08:45.704480 kubelet[2826]: I1216 02:08:45.704430 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/883466bef02aa67af54c1ed2b3d23367-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-9-be0981937a\" (UID: \"883466bef02aa67af54c1ed2b3d23367\") " pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" Dec 16 02:08:46.371765 kubelet[2826]: I1216 02:08:46.371442 2826 apiserver.go:52] "Watching apiserver" Dec 16 02:08:46.402551 kubelet[2826]: I1216 02:08:46.402496 2826 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 02:08:46.430964 kubelet[2826]: I1216 02:08:46.430694 2826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-0-0-9-be0981937a" podStartSLOduration=1.430677195 podStartE2EDuration="1.430677195s" podCreationTimestamp="2025-12-16 02:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 02:08:46.416758473 +0000 UTC m=+1.134023476" watchObservedRunningTime="2025-12-16 02:08:46.430677195 +0000 UTC m=+1.147942197" Dec 16 02:08:46.448398 kubelet[2826]: I1216 02:08:46.448170 2826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-0-0-9-be0981937a" podStartSLOduration=1.448151899 podStartE2EDuration="1.448151899s" podCreationTimestamp="2025-12-16 02:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 02:08:46.432181945 +0000 UTC m=+1.149446947" watchObservedRunningTime="2025-12-16 02:08:46.448151899 +0000 UTC m=+1.165416901" Dec 16 02:08:46.463272 kubelet[2826]: I1216 02:08:46.463133 2826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" podStartSLOduration=1.46311418 podStartE2EDuration="1.46311418s" podCreationTimestamp="2025-12-16 02:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 02:08:46.450305148 +0000 UTC m=+1.167570190" watchObservedRunningTime="2025-12-16 02:08:46.46311418 +0000 UTC m=+1.180379182" Dec 16 02:08:46.468062 kubelet[2826]: I1216 02:08:46.467761 2826 kubelet.go:3219] "Creating a mirror pod for static pod" 
pod="kube-system/kube-scheduler-ci-4547-0-0-9-be0981937a" Dec 16 02:08:46.470169 kubelet[2826]: I1216 02:08:46.470133 2826 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" Dec 16 02:08:46.480118 kubelet[2826]: E1216 02:08:46.480060 2826 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-9-be0981937a\" already exists" pod="kube-system/kube-scheduler-ci-4547-0-0-9-be0981937a" Dec 16 02:08:46.481356 kubelet[2826]: E1216 02:08:46.481317 2826 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-9-be0981937a\" already exists" pod="kube-system/kube-apiserver-ci-4547-0-0-9-be0981937a" Dec 16 02:08:50.844545 kubelet[2826]: I1216 02:08:50.844487 2826 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 02:08:50.847059 containerd[1587]: time="2025-12-16T02:08:50.846019288Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 16 02:08:50.847663 kubelet[2826]: I1216 02:08:50.846505 2826 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 02:08:51.727170 systemd[1]: Created slice kubepods-besteffort-pod766a31f6_c7e9_422d_881a_543647c1ef6e.slice - libcontainer container kubepods-besteffort-pod766a31f6_c7e9_422d_881a_543647c1ef6e.slice. Dec 16 02:08:51.746061 kubelet[2826]: I1216 02:08:51.744696 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/766a31f6-c7e9-422d-881a-543647c1ef6e-kube-proxy\") pod \"kube-proxy-mtf74\" (UID: \"766a31f6-c7e9-422d-881a-543647c1ef6e\") " pod="kube-system/kube-proxy-mtf74" Dec 16 02:08:51.746061 kubelet[2826]: I1216 02:08:51.744739 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/766a31f6-c7e9-422d-881a-543647c1ef6e-lib-modules\") pod \"kube-proxy-mtf74\" (UID: \"766a31f6-c7e9-422d-881a-543647c1ef6e\") " pod="kube-system/kube-proxy-mtf74" Dec 16 02:08:51.746061 kubelet[2826]: I1216 02:08:51.744764 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr42q\" (UniqueName: \"kubernetes.io/projected/766a31f6-c7e9-422d-881a-543647c1ef6e-kube-api-access-tr42q\") pod \"kube-proxy-mtf74\" (UID: \"766a31f6-c7e9-422d-881a-543647c1ef6e\") " pod="kube-system/kube-proxy-mtf74" Dec 16 02:08:51.746061 kubelet[2826]: I1216 02:08:51.744783 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/766a31f6-c7e9-422d-881a-543647c1ef6e-xtables-lock\") pod \"kube-proxy-mtf74\" (UID: \"766a31f6-c7e9-422d-881a-543647c1ef6e\") " pod="kube-system/kube-proxy-mtf74" Dec 16 02:08:51.856860 kubelet[2826]: E1216 02:08:51.856814 2826 projected.go:291] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Dec 16 02:08:51.856860 kubelet[2826]: E1216 02:08:51.856860 2826 projected.go:196] Error preparing data for projected volume kube-api-access-tr42q for pod kube-system/kube-proxy-mtf74: configmap "kube-root-ca.crt" not found Dec 16 02:08:51.858732 kubelet[2826]: E1216 02:08:51.856935 2826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/766a31f6-c7e9-422d-881a-543647c1ef6e-kube-api-access-tr42q podName:766a31f6-c7e9-422d-881a-543647c1ef6e nodeName:}" failed. No retries permitted until 2025-12-16 02:08:52.356912522 +0000 UTC m=+7.074177524 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tr42q" (UniqueName: "kubernetes.io/projected/766a31f6-c7e9-422d-881a-543647c1ef6e-kube-api-access-tr42q") pod "kube-proxy-mtf74" (UID: "766a31f6-c7e9-422d-881a-543647c1ef6e") : configmap "kube-root-ca.crt" not found Dec 16 02:08:52.048086 systemd[1]: Created slice kubepods-besteffort-pod16314a29_f84d_449a_81b2_ef2731bda6f4.slice - libcontainer container kubepods-besteffort-pod16314a29_f84d_449a_81b2_ef2731bda6f4.slice. Dec 16 02:08:52.148286 kubelet[2826]: I1216 02:08:52.148228 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92p9t\" (UniqueName: \"kubernetes.io/projected/16314a29-f84d-449a-81b2-ef2731bda6f4-kube-api-access-92p9t\") pod \"tigera-operator-65cdcdfd6d-crjgd\" (UID: \"16314a29-f84d-449a-81b2-ef2731bda6f4\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-crjgd" Dec 16 02:08:52.148286 kubelet[2826]: I1216 02:08:52.148297 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/16314a29-f84d-449a-81b2-ef2731bda6f4-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-crjgd\" (UID: \"16314a29-f84d-449a-81b2-ef2731bda6f4\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-crjgd" Dec 16 02:08:52.355857 containerd[1587]: time="2025-12-16T02:08:52.355637205Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-crjgd,Uid:16314a29-f84d-449a-81b2-ef2731bda6f4,Namespace:tigera-operator,Attempt:0,}" Dec 16 02:08:52.378713 containerd[1587]: time="2025-12-16T02:08:52.378606563Z" level=info msg="connecting to shim 308c511d8fce666b8915ae8dfb8b3cd7fb1cf254d057585b85c4bd6687cdc4ea" address="unix:///run/containerd/s/eea70695d8e1f8d312e6eac1961b1d75231882d918380480220855a700a81cde" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:08:52.409406 systemd[1]: Started cri-containerd-308c511d8fce666b8915ae8dfb8b3cd7fb1cf254d057585b85c4bd6687cdc4ea.scope - libcontainer container 308c511d8fce666b8915ae8dfb8b3cd7fb1cf254d057585b85c4bd6687cdc4ea. 
Dec 16 02:08:52.427873 kernel: kauditd_printk_skb: 32 callbacks suppressed Dec 16 02:08:52.427988 kernel: audit: type=1334 audit(1765850932.425:430): prog-id=133 op=LOAD Dec 16 02:08:52.425000 audit: BPF prog-id=133 op=LOAD Dec 16 02:08:52.429041 kernel: audit: type=1334 audit(1765850932.427:431): prog-id=134 op=LOAD Dec 16 02:08:52.429121 kernel: audit: type=1300 audit(1765850932.427:431): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2889 pid=2900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.427000 audit: BPF prog-id=134 op=LOAD Dec 16 02:08:52.427000 audit[2900]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2889 pid=2900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.427000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330386335313164386663653636366238393135616538646662386233 Dec 16 02:08:52.434514 kernel: audit: type=1327 audit(1765850932.427:431): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330386335313164386663653636366238393135616538646662386233 Dec 16 02:08:52.427000 audit: BPF prog-id=134 op=UNLOAD Dec 16 02:08:52.435127 kernel: audit: type=1334 audit(1765850932.427:432): prog-id=134 op=UNLOAD Dec 16 02:08:52.435172 kernel: audit: type=1300 audit(1765850932.427:432): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2889 pid=2900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.427000 audit[2900]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2889 pid=2900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.427000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330386335313164386663653636366238393135616538646662386233 Dec 16 02:08:52.439655 kernel: audit: type=1327 audit(1765850932.427:432): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330386335313164386663653636366238393135616538646662386233 Dec 16 02:08:52.427000 audit: BPF prog-id=135 op=LOAD Dec 16 02:08:52.442763 kernel: audit: type=1334 audit(1765850932.427:433): prog-id=135 op=LOAD Dec 16 02:08:52.442811 kernel: audit: type=1300 audit(1765850932.427:433): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2889 pid=2900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.427000 audit[2900]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2889 pid=2900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.427000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330386335313164386663653636366238393135616538646662386233 Dec 16 02:08:52.449056 kernel: audit: type=1327 audit(1765850932.427:433): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330386335313164386663653636366238393135616538646662386233 Dec 16 02:08:52.431000 audit: BPF prog-id=136 op=LOAD Dec 16 02:08:52.431000 audit[2900]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2889 pid=2900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.431000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330386335313164386663653636366238393135616538646662386233 Dec 16 02:08:52.434000 audit: BPF prog-id=136 op=UNLOAD Dec 16 02:08:52.434000 audit[2900]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2889 pid=2900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.434000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330386335313164386663653636366238393135616538646662386233 Dec 16 02:08:52.434000 audit: BPF prog-id=135 op=UNLOAD Dec 16 02:08:52.434000 audit[2900]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2889 pid=2900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.434000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330386335313164386663653636366238393135616538646662386233 Dec 16 02:08:52.434000 audit: BPF prog-id=137 op=LOAD Dec 16 02:08:52.434000 audit[2900]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2889 pid=2900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.434000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330386335313164386663653636366238393135616538646662386233 Dec 16 02:08:52.475348 containerd[1587]: time="2025-12-16T02:08:52.475310281Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-crjgd,Uid:16314a29-f84d-449a-81b2-ef2731bda6f4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"308c511d8fce666b8915ae8dfb8b3cd7fb1cf254d057585b85c4bd6687cdc4ea\"" Dec 16 02:08:52.477872 containerd[1587]: time="2025-12-16T02:08:52.477615150Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 02:08:52.639283 containerd[1587]: time="2025-12-16T02:08:52.639170158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mtf74,Uid:766a31f6-c7e9-422d-881a-543647c1ef6e,Namespace:kube-system,Attempt:0,}" Dec 16 02:08:52.665700 containerd[1587]: time="2025-12-16T02:08:52.665252213Z" level=info msg="connecting to shim 09608c10f1bf9c5815cd25d918622da287f5bec79da129af58f9dc767989e185" address="unix:///run/containerd/s/00f051a17047c926f676c75a40ed931d632ba1ede7ba67d12c387602b7c24002" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:08:52.693469 systemd[1]: Started cri-containerd-09608c10f1bf9c5815cd25d918622da287f5bec79da129af58f9dc767989e185.scope - libcontainer container 09608c10f1bf9c5815cd25d918622da287f5bec79da129af58f9dc767989e185. Dec 16 02:08:52.705000 audit: BPF prog-id=138 op=LOAD Dec 16 02:08:52.705000 audit: BPF prog-id=139 op=LOAD Dec 16 02:08:52.705000 audit[2948]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2935 pid=2948 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.705000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039363038633130663162663963353831356364323564393138363232 Dec 16 02:08:52.706000 audit: BPF prog-id=139 op=UNLOAD Dec 16 02:08:52.706000 audit[2948]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2935 pid=2948 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.706000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039363038633130663162663963353831356364323564393138363232 Dec 16 02:08:52.706000 audit: BPF prog-id=140 op=LOAD Dec 16 02:08:52.706000 audit[2948]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2935 pid=2948 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.706000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039363038633130663162663963353831356364323564393138363232 Dec 16 02:08:52.706000 audit: BPF prog-id=141 op=LOAD Dec 16 02:08:52.706000 audit[2948]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2935 pid=2948 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.706000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039363038633130663162663963353831356364323564393138363232 Dec 16 02:08:52.706000 audit: BPF prog-id=141 op=UNLOAD Dec 16 02:08:52.706000 audit[2948]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2935 pid=2948 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.706000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039363038633130663162663963353831356364323564393138363232 Dec 16 02:08:52.706000 audit: BPF prog-id=140 op=UNLOAD Dec 16 02:08:52.706000 audit[2948]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2935 pid=2948 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.706000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039363038633130663162663963353831356364323564393138363232 Dec 16 02:08:52.707000 audit: BPF prog-id=142 op=LOAD Dec 16 02:08:52.707000 audit[2948]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2935 pid=2948 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.707000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039363038633130663162663963353831356364323564393138363232 Dec 16 02:08:52.725379 containerd[1587]: time="2025-12-16T02:08:52.725339704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mtf74,Uid:766a31f6-c7e9-422d-881a-543647c1ef6e,Namespace:kube-system,Attempt:0,} returns sandbox id \"09608c10f1bf9c5815cd25d918622da287f5bec79da129af58f9dc767989e185\"" Dec 16 02:08:52.733929 containerd[1587]: time="2025-12-16T02:08:52.733889330Z" level=info msg="CreateContainer within sandbox \"09608c10f1bf9c5815cd25d918622da287f5bec79da129af58f9dc767989e185\" for container 
&ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 02:08:52.744686 containerd[1587]: time="2025-12-16T02:08:52.744615207Z" level=info msg="Container 64ed50da386c76f35e187e03e1f6b24617761a4b3ac2a691ce4c8e5d620f7231: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:08:52.759477 containerd[1587]: time="2025-12-16T02:08:52.759386146Z" level=info msg="CreateContainer within sandbox \"09608c10f1bf9c5815cd25d918622da287f5bec79da129af58f9dc767989e185\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"64ed50da386c76f35e187e03e1f6b24617761a4b3ac2a691ce4c8e5d620f7231\"" Dec 16 02:08:52.763515 containerd[1587]: time="2025-12-16T02:08:52.762883015Z" level=info msg="StartContainer for \"64ed50da386c76f35e187e03e1f6b24617761a4b3ac2a691ce4c8e5d620f7231\"" Dec 16 02:08:52.765399 containerd[1587]: time="2025-12-16T02:08:52.765357866Z" level=info msg="connecting to shim 64ed50da386c76f35e187e03e1f6b24617761a4b3ac2a691ce4c8e5d620f7231" address="unix:///run/containerd/s/00f051a17047c926f676c75a40ed931d632ba1ede7ba67d12c387602b7c24002" protocol=ttrpc version=3 Dec 16 02:08:52.785310 systemd[1]: Started cri-containerd-64ed50da386c76f35e187e03e1f6b24617761a4b3ac2a691ce4c8e5d620f7231.scope - libcontainer container 64ed50da386c76f35e187e03e1f6b24617761a4b3ac2a691ce4c8e5d620f7231. Dec 16 02:08:52.838000 audit: BPF prog-id=143 op=LOAD Dec 16 02:08:52.838000 audit[2973]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2935 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.838000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634656435306461333836633736663335653138376530336531663662 Dec 16 02:08:52.838000 audit: BPF prog-id=144 op=LOAD Dec 16 02:08:52.838000 audit[2973]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2935 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.838000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634656435306461333836633736663335653138376530336531663662 Dec 16 02:08:52.838000 audit: BPF prog-id=144 op=UNLOAD Dec 16 02:08:52.838000 audit[2973]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2935 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.838000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634656435306461333836633736663335653138376530336531663662 Dec 16 02:08:52.838000 audit: BPF prog-id=143 op=UNLOAD Dec 16 02:08:52.838000 audit[2973]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2935 pid=2973 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.838000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634656435306461333836633736663335653138376530336531663662 Dec 16 02:08:52.838000 audit: BPF prog-id=145 op=LOAD Dec 16 02:08:52.838000 audit[2973]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2935 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:52.838000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634656435306461333836633736663335653138376530336531663662 Dec 16 02:08:52.869224 containerd[1587]: time="2025-12-16T02:08:52.869180698Z" level=info msg="StartContainer for \"64ed50da386c76f35e187e03e1f6b24617761a4b3ac2a691ce4c8e5d620f7231\" returns successfully" Dec 16 02:08:53.119000 audit[3037]: NETFILTER_CFG table=mangle:54 family=10 entries=1 op=nft_register_chain pid=3037 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.119000 audit[3037]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffcfb4d2c0 a2=0 a3=1 items=0 ppid=2986 pid=3037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.119000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 02:08:53.121000 audit[3039]: NETFILTER_CFG table=nat:55 family=10 entries=1 op=nft_register_chain pid=3039 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.121000 audit[3039]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdc709ad0 a2=0 a3=1 items=0 ppid=2986 pid=3039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.121000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 02:08:53.121000 audit[3038]: NETFILTER_CFG table=mangle:56 family=2 entries=1 op=nft_register_chain pid=3038 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.121000 audit[3038]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc06eb3c0 a2=0 a3=1 items=0 ppid=2986 pid=3038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.121000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 02:08:53.122000 audit[3040]: NETFILTER_CFG table=filter:57 family=10 entries=1 op=nft_register_chain pid=3040 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.122000 audit[3040]: SYSCALL arch=c00000b7 
syscall=211 success=yes exit=104 a0=3 a1=ffffc5aa7810 a2=0 a3=1 items=0 ppid=2986 pid=3040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.122000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 02:08:53.123000 audit[3042]: NETFILTER_CFG table=nat:58 family=2 entries=1 op=nft_register_chain pid=3042 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.123000 audit[3042]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe23b79b0 a2=0 a3=1 items=0 ppid=2986 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.123000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 02:08:53.125000 audit[3043]: NETFILTER_CFG table=filter:59 family=2 entries=1 op=nft_register_chain pid=3043 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.125000 audit[3043]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc37a0d20 a2=0 a3=1 items=0 ppid=2986 pid=3043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.125000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 02:08:53.227000 audit[3045]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3045 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.227000 audit[3045]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffc4103270 a2=0 a3=1 items=0 ppid=2986 pid=3045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.227000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 02:08:53.231000 audit[3047]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3047 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.231000 audit[3047]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffc4f03de0 a2=0 a3=1 items=0 ppid=2986 pid=3047 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.231000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73002D Dec 16 02:08:53.236000 audit[3050]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3050 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.236000 audit[3050]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd2ed7e00 a2=0 a3=1 items=0 ppid=2986 pid=3050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.236000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Dec 16 02:08:53.238000 audit[3051]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3051 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.238000 audit[3051]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcdefdd30 a2=0 a3=1 items=0 ppid=2986 pid=3051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.238000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 02:08:53.243000 audit[3053]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.243000 audit[3053]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffffe1a2fe0 a2=0 a3=1 items=0 ppid=2986 pid=3053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.243000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 02:08:53.245000 audit[3054]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3054 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.245000 audit[3054]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe0a16dc0 a2=0 a3=1 items=0 ppid=2986 pid=3054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.245000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 02:08:53.250000 audit[3056]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3056 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.250000 audit[3056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffcd7cd590 a2=0 a3=1 items=0 ppid=2986 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.250000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 02:08:53.256000 audit[3059]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3059 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.256000 audit[3059]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffc7f09c60 a2=0 a3=1 items=0 ppid=2986 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.256000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 02:08:53.258000 audit[3060]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3060 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.258000 audit[3060]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd9d1fc70 a2=0 a3=1 items=0 ppid=2986 pid=3060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.258000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 02:08:53.261000 audit[3062]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3062 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.261000 audit[3062]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd24966d0 a2=0 a3=1 items=0 ppid=2986 pid=3062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.261000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 02:08:53.266000 audit[3063]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3063 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.266000 audit[3063]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc46e5330 a2=0 a3=1 items=0 ppid=2986 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.266000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 02:08:53.272000 audit[3065]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3065 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.272000 audit[3065]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd2cf3680 a2=0 a3=1 items=0 ppid=2986 pid=3065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.272000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F5859 Dec 16 02:08:53.276000 audit[3068]: NETFILTER_CFG table=filter:72 family=2 entries=1 
op=nft_register_rule pid=3068 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.276000 audit[3068]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff9320120 a2=0 a3=1 items=0 ppid=2986 pid=3068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.276000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Dec 16 02:08:53.280000 audit[3071]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3071 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.280000 audit[3071]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffebc325d0 a2=0 a3=1 items=0 ppid=2986 pid=3071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.280000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Dec 16 02:08:53.282000 audit[3072]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3072 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.282000 audit[3072]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffebb83b20 a2=0 a3=1 items=0 ppid=2986 pid=3072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.282000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 02:08:53.285000 audit[3074]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3074 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.285000 audit[3074]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffe90592a0 a2=0 a3=1 items=0 ppid=2986 pid=3074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.285000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 02:08:53.289000 audit[3077]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3077 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.289000 audit[3077]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffebbc7090 a2=0 a3=1 items=0 ppid=2986 pid=3077 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.289000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 02:08:53.290000 audit[3078]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3078 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.290000 audit[3078]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc59b9c70 a2=0 a3=1 items=0 ppid=2986 pid=3078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.290000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 02:08:53.296000 audit[3080]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3080 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 02:08:53.296000 audit[3080]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffe385a060 a2=0 a3=1 items=0 ppid=2986 pid=3080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.296000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 02:08:53.320000 audit[3086]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3086 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:08:53.320000 audit[3086]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffea0ae470 a2=0 a3=1 items=0 ppid=2986 pid=3086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.320000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:08:53.335000 audit[3086]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3086 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:08:53.335000 audit[3086]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffea0ae470 a2=0 a3=1 items=0 ppid=2986 pid=3086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.335000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:08:53.337000 audit[3092]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3092 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.337000 audit[3092]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffd94f0ff0 a2=0 a3=1 items=0 ppid=2986 pid=3092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.337000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 02:08:53.340000 audit[3094]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.340000 audit[3094]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffee0d11f0 a2=0 a3=1 items=0 ppid=2986 pid=3094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.340000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Dec 16 02:08:53.344000 audit[3097]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.344000 audit[3097]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffffab7e5c0 a2=0 a3=1 items=0 ppid=2986 pid=3097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.344000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C Dec 16 02:08:53.346000 audit[3098]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3098 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.346000 audit[3098]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd7a38d70 a2=0 a3=1 items=0 ppid=2986 pid=3098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.346000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 02:08:53.348000 audit[3100]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.348000 audit[3100]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff28272f0 a2=0 a3=1 items=0 ppid=2986 pid=3100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.348000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 02:08:53.350000 audit[3101]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3101 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.350000 audit[3101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe89c3db0 a2=0 a3=1 items=0 ppid=2986 pid=3101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.350000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 02:08:53.354000 audit[3103]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3103 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.354000 audit[3103]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffffee79a00 a2=0 a3=1 items=0 ppid=2986 pid=3103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.354000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 02:08:53.360000 audit[3106]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3106 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.360000 audit[3106]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffd3c36760 a2=0 a3=1 items=0 ppid=2986 pid=3106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.360000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 02:08:53.362000 audit[3107]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3107 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.362000 audit[3107]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc802ff70 a2=0 a3=1 items=0 ppid=2986 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.362000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 02:08:53.366000 audit[3109]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3109 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.366000 audit[3109]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffffe9a42a0 a2=0 a3=1 items=0 ppid=2986 pid=3109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.366000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 02:08:53.367000 audit[3110]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3110 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.367000 audit[3110]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=104 a0=3 a1=ffffc90ab5d0 a2=0 a3=1 items=0 ppid=2986 pid=3110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.367000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 02:08:53.370000 audit[3112]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3112 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.370000 audit[3112]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffdcd4b480 a2=0 a3=1 items=0 ppid=2986 pid=3112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.370000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Dec 16 02:08:53.375000 audit[3115]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3115 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.375000 audit[3115]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcf4b3290 a2=0 a3=1 items=0 ppid=2986 pid=3115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.375000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Dec 16 02:08:53.379000 audit[3118]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3118 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.379000 audit[3118]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff122a540 a2=0 a3=1 items=0 ppid=2986 pid=3118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.379000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D5052 Dec 16 02:08:53.381000 audit[3119]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3119 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.381000 audit[3119]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffe0ab4080 a2=0 a3=1 items=0 ppid=2986 pid=3119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.381000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 02:08:53.385000 audit[3121]: 
NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.385000 audit[3121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffc233ec60 a2=0 a3=1 items=0 ppid=2986 pid=3121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.385000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 02:08:53.389000 audit[3124]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3124 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.389000 audit[3124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd2e60b60 a2=0 a3=1 items=0 ppid=2986 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.389000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 02:08:53.395000 audit[3125]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3125 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.395000 audit[3125]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff36a0110 a2=0 a3=1 items=0 ppid=2986 pid=3125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.395000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 02:08:53.399000 audit[3127]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3127 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.399000 audit[3127]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffe39708c0 a2=0 a3=1 items=0 ppid=2986 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.399000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 02:08:53.401000 audit[3128]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3128 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.401000 audit[3128]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc28d2410 a2=0 a3=1 items=0 ppid=2986 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.401000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 
02:08:53.404000 audit[3130]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3130 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.404000 audit[3130]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd28a3170 a2=0 a3=1 items=0 ppid=2986 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.404000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 02:08:53.407000 audit[3133]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3133 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 02:08:53.407000 audit[3133]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffec528bb0 a2=0 a3=1 items=0 ppid=2986 pid=3133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.407000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 02:08:53.411000 audit[3135]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3135 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 02:08:53.411000 audit[3135]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=ffffef7c3900 a2=0 a3=1 items=0 ppid=2986 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.411000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:08:53.412000 audit[3135]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3135 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 02:08:53.412000 audit[3135]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffef7c3900 a2=0 a3=1 items=0 ppid=2986 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:53.412000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:08:53.894877 kubelet[2826]: I1216 02:08:53.894602 2826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-mtf74" podStartSLOduration=2.89458268 podStartE2EDuration="2.89458268s" podCreationTimestamp="2025-12-16 02:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 02:08:53.520500706 +0000 UTC m=+8.237765708" watchObservedRunningTime="2025-12-16 02:08:53.89458268 +0000 UTC m=+8.611847682" Dec 16 02:08:54.143993 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1614251478.mount: Deactivated successfully. 
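The PROCTITLE fields in the audit records above are the invoked command lines, hex-encoded with NUL bytes separating the argv elements. A minimal decoding sketch (plain Python, nothing assumed beyond the hex strings visible in the log):

    # Decode an audit PROCTITLE value (hex-encoded, NUL-separated argv)
    # back into the readable command line.
    def decode_proctitle(hex_str: str) -> str:
        raw = bytes.fromhex(hex_str)
        return " ".join(arg.decode("utf-8", "replace") for arg in raw.split(b"\x00"))

    # The iptables-restore invocation recorded repeatedly above:
    print(decode_proctitle(
        "69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273"
    ))
    # -> iptables-restore -w 5 --noflush --counters

Some PROCTITLE values above appear cut short by the audit proctitle length limit, so the decoded argument list can end mid-word (e.g. "KUBE-PROX").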
Dec 16 02:08:54.561404 containerd[1587]: time="2025-12-16T02:08:54.561328315Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:54.564735 containerd[1587]: time="2025-12-16T02:08:54.563110865Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=0" Dec 16 02:08:54.568324 containerd[1587]: time="2025-12-16T02:08:54.567220637Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:54.570392 containerd[1587]: time="2025-12-16T02:08:54.570332279Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:08:54.571264 containerd[1587]: time="2025-12-16T02:08:54.571194190Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.093529954s" Dec 16 02:08:54.571680 containerd[1587]: time="2025-12-16T02:08:54.571628607Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 16 02:08:54.576275 containerd[1587]: time="2025-12-16T02:08:54.576232282Z" level=info msg="CreateContainer within sandbox \"308c511d8fce666b8915ae8dfb8b3cd7fb1cf254d057585b85c4bd6687cdc4ea\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 02:08:54.585695 containerd[1587]: time="2025-12-16T02:08:54.585644458Z" level=info msg="Container 925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:08:54.594301 containerd[1587]: time="2025-12-16T02:08:54.594233849Z" level=info msg="CreateContainer within sandbox \"308c511d8fce666b8915ae8dfb8b3cd7fb1cf254d057585b85c4bd6687cdc4ea\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605\"" Dec 16 02:08:54.596046 containerd[1587]: time="2025-12-16T02:08:54.595994436Z" level=info msg="StartContainer for \"925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605\"" Dec 16 02:08:54.596937 containerd[1587]: time="2025-12-16T02:08:54.596879751Z" level=info msg="connecting to shim 925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605" address="unix:///run/containerd/s/eea70695d8e1f8d312e6eac1961b1d75231882d918380480220855a700a81cde" protocol=ttrpc version=3 Dec 16 02:08:54.620239 systemd[1]: Started cri-containerd-925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605.scope - libcontainer container 925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605. 
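The PullImage record above reports both the image size and the pull duration; a quick arithmetic check of those two numbers (values copied from the log line, nothing else assumed) gives the effective pull throughput:

    size_bytes = 22_147_999      # size "22147999" from the PullImage record
    duration_s = 2.093529954     # "in 2.093529954s" from the same record
    print(f"{size_bytes / duration_s / 2**20:.1f} MiB/s")   # ~10.1 MiB/s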
Dec 16 02:08:54.634000 audit: BPF prog-id=146 op=LOAD Dec 16 02:08:54.635000 audit: BPF prog-id=147 op=LOAD Dec 16 02:08:54.635000 audit[3144]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=2889 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:54.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932356535383965333032326462383338643261303665386538616661 Dec 16 02:08:54.635000 audit: BPF prog-id=147 op=UNLOAD Dec 16 02:08:54.635000 audit[3144]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2889 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:54.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932356535383965333032326462383338643261303665386538616661 Dec 16 02:08:54.635000 audit: BPF prog-id=148 op=LOAD Dec 16 02:08:54.635000 audit[3144]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=2889 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:54.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932356535383965333032326462383338643261303665386538616661 Dec 16 02:08:54.636000 audit: BPF prog-id=149 op=LOAD Dec 16 02:08:54.636000 audit[3144]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=2889 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:54.636000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932356535383965333032326462383338643261303665386538616661 Dec 16 02:08:54.636000 audit: BPF prog-id=149 op=UNLOAD Dec 16 02:08:54.636000 audit[3144]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2889 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:54.636000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932356535383965333032326462383338643261303665386538616661 Dec 16 02:08:54.637000 audit: BPF prog-id=148 op=UNLOAD Dec 16 02:08:54.637000 audit[3144]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2889 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:54.637000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932356535383965333032326462383338643261303665386538616661 Dec 16 02:08:54.637000 audit: BPF prog-id=150 op=LOAD Dec 16 02:08:54.637000 audit[3144]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=2889 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:08:54.637000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932356535383965333032326462383338643261303665386538616661 Dec 16 02:08:54.663954 containerd[1587]: time="2025-12-16T02:08:54.663731713Z" level=info msg="StartContainer for \"925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605\" returns successfully" Dec 16 02:09:00.831118 sudo[1900]: pam_unix(sudo:session): session closed for user root Dec 16 02:09:00.830000 audit[1900]: USER_END pid=1900 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 02:09:00.835094 kernel: kauditd_printk_skb: 224 callbacks suppressed Dec 16 02:09:00.835172 kernel: audit: type=1106 audit(1765850940.830:510): pid=1900 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 02:09:00.835197 kernel: audit: type=1104 audit(1765850940.830:511): pid=1900 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 02:09:00.830000 audit[1900]: CRED_DISP pid=1900 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 02:09:00.991270 sshd[1899]: Connection closed by 139.178.68.195 port 40146 Dec 16 02:09:00.993315 sshd-session[1875]: pam_unix(sshd:session): session closed for user core Dec 16 02:09:00.993000 audit[1875]: USER_END pid=1875 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:09:00.993000 audit[1875]: CRED_DISP pid=1875 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:09:01.001663 kernel: audit: type=1106 audit(1765850940.993:512): pid=1875 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:09:01.001749 kernel: audit: type=1104 audit(1765850940.993:513): pid=1875 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:09:01.002500 systemd[1]: sshd@6-49.13.61.135:22-139.178.68.195:40146.service: Deactivated successfully. Dec 16 02:09:01.001000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-49.13.61.135:22-139.178.68.195:40146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:09:01.006200 kernel: audit: type=1131 audit(1765850941.001:514): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-49.13.61.135:22-139.178.68.195:40146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:09:01.006628 systemd[1]: session-8.scope: Deactivated successfully. Dec 16 02:09:01.008207 systemd[1]: session-8.scope: Consumed 7.923s CPU time, 221.5M memory peak. Dec 16 02:09:01.012442 systemd-logind[1564]: Session 8 logged out. Waiting for processes to exit. Dec 16 02:09:01.015286 systemd-logind[1564]: Removed session 8. 
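The kernel audit lines carry their own timestamp as audit(EPOCH.MS:SERIAL); converting the epoch value shows it lines up with the journal's wall-clock prefix. A small check using only the standard library:

    from datetime import datetime, timezone

    # "audit(1765850940.830:510)" from the USER_END record above
    epoch = 1765850940.830
    print(datetime.fromtimestamp(epoch, tz=timezone.utc).isoformat())
    # -> 2025-12-16T02:09:00.830000+00:00, matching the 02:09:00.830000 prefix on that record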
Dec 16 02:09:04.141510 kernel: audit: type=1325 audit(1765850944.136:515): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3224 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:04.141638 kernel: audit: type=1300 audit(1765850944.136:515): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffc72e2290 a2=0 a3=1 items=0 ppid=2986 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:04.136000 audit[3224]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3224 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:04.136000 audit[3224]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffc72e2290 a2=0 a3=1 items=0 ppid=2986 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:04.136000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:04.144132 kernel: audit: type=1327 audit(1765850944.136:515): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:04.147000 audit[3224]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3224 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:04.147000 audit[3224]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc72e2290 a2=0 a3=1 items=0 ppid=2986 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:04.154059 kernel: audit: type=1325 audit(1765850944.147:516): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3224 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:04.154154 kernel: audit: type=1300 audit(1765850944.147:516): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc72e2290 a2=0 a3=1 items=0 ppid=2986 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:04.147000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:05.165000 audit[3226]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:05.165000 audit[3226]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffdc0cffd0 a2=0 a3=1 items=0 ppid=2986 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:05.165000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:05.171000 audit[3226]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:05.171000 audit[3226]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=2700 a0=3 a1=ffffdc0cffd0 a2=0 a3=1 items=0 ppid=2986 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:05.171000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:09.022000 audit[3229]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3229 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:09.025180 kernel: kauditd_printk_skb: 7 callbacks suppressed Dec 16 02:09:09.025238 kernel: audit: type=1325 audit(1765850949.022:519): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3229 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:09.022000 audit[3229]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff8c6cfb0 a2=0 a3=1 items=0 ppid=2986 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:09.028494 kernel: audit: type=1300 audit(1765850949.022:519): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff8c6cfb0 a2=0 a3=1 items=0 ppid=2986 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:09.022000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:09.030109 kernel: audit: type=1327 audit(1765850949.022:519): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:09.037000 audit[3229]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3229 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:09.037000 audit[3229]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff8c6cfb0 a2=0 a3=1 items=0 ppid=2986 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:09.043506 kernel: audit: type=1325 audit(1765850949.037:520): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3229 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:09.043580 kernel: audit: type=1300 audit(1765850949.037:520): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff8c6cfb0 a2=0 a3=1 items=0 ppid=2986 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:09.037000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:09.045635 kernel: audit: type=1327 audit(1765850949.037:520): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:10.064000 audit[3232]: NETFILTER_CFG table=filter:111 family=2 entries=19 op=nft_register_rule pid=3232 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:10.064000 audit[3232]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=7480 a0=3 a1=fffffe767d60 a2=0 a3=1 items=0 ppid=2986 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:10.071563 kernel: audit: type=1325 audit(1765850950.064:521): table=filter:111 family=2 entries=19 op=nft_register_rule pid=3232 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:10.071674 kernel: audit: type=1300 audit(1765850950.064:521): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffffe767d60 a2=0 a3=1 items=0 ppid=2986 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:10.064000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:10.073637 kernel: audit: type=1327 audit(1765850950.064:521): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:10.067000 audit[3232]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3232 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:10.075263 kernel: audit: type=1325 audit(1765850950.067:522): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3232 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:10.067000 audit[3232]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffffe767d60 a2=0 a3=1 items=0 ppid=2986 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:10.067000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:12.739090 kubelet[2826]: I1216 02:09:12.738377 2826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-crjgd" podStartSLOduration=19.642325227 podStartE2EDuration="21.738276657s" podCreationTimestamp="2025-12-16 02:08:51 +0000 UTC" firstStartedPulling="2025-12-16 02:08:52.476935579 +0000 UTC m=+7.194200581" lastFinishedPulling="2025-12-16 02:08:54.572887009 +0000 UTC m=+9.290152011" observedRunningTime="2025-12-16 02:08:55.513498546 +0000 UTC m=+10.230763588" watchObservedRunningTime="2025-12-16 02:09:12.738276657 +0000 UTC m=+27.455541659" Dec 16 02:09:12.749976 systemd[1]: Created slice kubepods-besteffort-pod02d9702b_e9a7_4584_9874_f821dbebc469.slice - libcontainer container kubepods-besteffort-pod02d9702b_e9a7_4584_9874_f821dbebc469.slice. 
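The pod_startup_latency_tracker record above reports both an SLO duration and an end-to-end duration for the tigera-operator pod; the SLO figure appears to be the E2E duration with the image-pull window excluded, and the numbers in the record are consistent with that reading (all values taken from the log line):

    e2e  = 21.738276657                  # podStartE2EDuration
    pull = 9.290152011 - 7.194200581     # lastFinishedPulling - firstStartedPulling (m=+ offsets)
    print(round(e2e - pull, 9))          # 19.642325227 == podStartSLOduration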
Dec 16 02:09:12.784551 kubelet[2826]: I1216 02:09:12.784334 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/02d9702b-e9a7-4584-9874-f821dbebc469-typha-certs\") pod \"calico-typha-55485b9c66-8zn8k\" (UID: \"02d9702b-e9a7-4584-9874-f821dbebc469\") " pod="calico-system/calico-typha-55485b9c66-8zn8k" Dec 16 02:09:12.784551 kubelet[2826]: I1216 02:09:12.784384 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7fxx\" (UniqueName: \"kubernetes.io/projected/02d9702b-e9a7-4584-9874-f821dbebc469-kube-api-access-x7fxx\") pod \"calico-typha-55485b9c66-8zn8k\" (UID: \"02d9702b-e9a7-4584-9874-f821dbebc469\") " pod="calico-system/calico-typha-55485b9c66-8zn8k" Dec 16 02:09:12.784551 kubelet[2826]: I1216 02:09:12.784404 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d9702b-e9a7-4584-9874-f821dbebc469-tigera-ca-bundle\") pod \"calico-typha-55485b9c66-8zn8k\" (UID: \"02d9702b-e9a7-4584-9874-f821dbebc469\") " pod="calico-system/calico-typha-55485b9c66-8zn8k" Dec 16 02:09:12.829000 audit[3234]: NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule pid=3234 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:12.829000 audit[3234]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=fffffb29c060 a2=0 a3=1 items=0 ppid=2986 pid=3234 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:12.829000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:12.835000 audit[3234]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3234 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:12.835000 audit[3234]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffffb29c060 a2=0 a3=1 items=0 ppid=2986 pid=3234 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:12.835000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:12.964432 systemd[1]: Created slice kubepods-besteffort-pod1c82a029_4ad6_4f52_9c8c_df108579e407.slice - libcontainer container kubepods-besteffort-pod1c82a029_4ad6_4f52_9c8c_df108579e407.slice. 
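The NETFILTER_CFG records scattered through this stretch show the IPv4 filter table growing (entries=15, 16, 17, 19, then 21) while the nat table stays at 12 entries, as kube-proxy (the parent pid 2986 seen on the earlier iptables calls) apparently re-syncs its rule set for newly observed services. A minimal parsing sketch for pulling those counts out of the records (regex written against the field layout visible above):

    import re

    # e.g. "NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule ..."
    pattern = re.compile(r"NETFILTER_CFG table=(\w+):\d+ family=(\d+) entries=(\d+) op=(\S+)")

    def netfilter_cfg_events(lines):
        for line in lines:
            m = pattern.search(line)
            if m:
                table, family, entries, op = m.groups()
                yield table, int(family), int(entries), op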
Dec 16 02:09:13.061654 containerd[1587]: time="2025-12-16T02:09:13.061378634Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-55485b9c66-8zn8k,Uid:02d9702b-e9a7-4584-9874-f821dbebc469,Namespace:calico-system,Attempt:0,}" Dec 16 02:09:13.085605 kubelet[2826]: I1216 02:09:13.085527 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c82a029-4ad6-4f52-9c8c-df108579e407-lib-modules\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.085919 kubelet[2826]: I1216 02:09:13.085680 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1c82a029-4ad6-4f52-9c8c-df108579e407-var-lib-calico\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.085919 kubelet[2826]: I1216 02:09:13.085823 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/1c82a029-4ad6-4f52-9c8c-df108579e407-node-certs\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.085919 kubelet[2826]: I1216 02:09:13.085844 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/1c82a029-4ad6-4f52-9c8c-df108579e407-policysync\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.085919 kubelet[2826]: I1216 02:09:13.085865 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/1c82a029-4ad6-4f52-9c8c-df108579e407-var-run-calico\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.086346 kubelet[2826]: I1216 02:09:13.086098 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/1c82a029-4ad6-4f52-9c8c-df108579e407-cni-bin-dir\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.086346 kubelet[2826]: I1216 02:09:13.086230 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/1c82a029-4ad6-4f52-9c8c-df108579e407-cni-log-dir\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.086346 kubelet[2826]: I1216 02:09:13.086253 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/1c82a029-4ad6-4f52-9c8c-df108579e407-cni-net-dir\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.086346 kubelet[2826]: I1216 02:09:13.086650 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: 
\"kubernetes.io/host-path/1c82a029-4ad6-4f52-9c8c-df108579e407-flexvol-driver-host\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.086346 kubelet[2826]: I1216 02:09:13.086702 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzr7c\" (UniqueName: \"kubernetes.io/projected/1c82a029-4ad6-4f52-9c8c-df108579e407-kube-api-access-wzr7c\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.087002 kubelet[2826]: I1216 02:09:13.086722 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c82a029-4ad6-4f52-9c8c-df108579e407-tigera-ca-bundle\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.087002 kubelet[2826]: I1216 02:09:13.086740 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1c82a029-4ad6-4f52-9c8c-df108579e407-xtables-lock\") pod \"calico-node-cztls\" (UID: \"1c82a029-4ad6-4f52-9c8c-df108579e407\") " pod="calico-system/calico-node-cztls" Dec 16 02:09:13.091015 containerd[1587]: time="2025-12-16T02:09:13.090933580Z" level=info msg="connecting to shim b0422c1f88f95aac23ed241384f3d7006bf671ed2312c1be101265da76341c8b" address="unix:///run/containerd/s/14d80120e117804fa123a6fb2194bbda042613de97df3718d0b32c966749d988" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:09:13.133314 systemd[1]: Started cri-containerd-b0422c1f88f95aac23ed241384f3d7006bf671ed2312c1be101265da76341c8b.scope - libcontainer container b0422c1f88f95aac23ed241384f3d7006bf671ed2312c1be101265da76341c8b. 
Dec 16 02:09:13.155000 audit: BPF prog-id=151 op=LOAD Dec 16 02:09:13.156000 audit: BPF prog-id=152 op=LOAD Dec 16 02:09:13.156000 audit[3257]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=3245 pid=3257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.156000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230343232633166383866393561616332336564323431333834663364 Dec 16 02:09:13.156000 audit: BPF prog-id=152 op=UNLOAD Dec 16 02:09:13.156000 audit[3257]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3245 pid=3257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.156000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230343232633166383866393561616332336564323431333834663364 Dec 16 02:09:13.157000 audit: BPF prog-id=153 op=LOAD Dec 16 02:09:13.157000 audit[3257]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=3245 pid=3257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230343232633166383866393561616332336564323431333834663364 Dec 16 02:09:13.157000 audit: BPF prog-id=154 op=LOAD Dec 16 02:09:13.157000 audit[3257]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=3245 pid=3257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230343232633166383866393561616332336564323431333834663364 Dec 16 02:09:13.157000 audit: BPF prog-id=154 op=UNLOAD Dec 16 02:09:13.157000 audit[3257]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3245 pid=3257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230343232633166383866393561616332336564323431333834663364 Dec 16 02:09:13.157000 audit: BPF prog-id=153 op=UNLOAD Dec 16 02:09:13.157000 audit[3257]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3245 pid=3257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230343232633166383866393561616332336564323431333834663364 Dec 16 02:09:13.157000 audit: BPF prog-id=155 op=LOAD Dec 16 02:09:13.157000 audit[3257]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=3245 pid=3257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.157000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6230343232633166383866393561616332336564323431333834663364 Dec 16 02:09:13.170099 kubelet[2826]: E1216 02:09:13.169972 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:09:13.201883 kubelet[2826]: E1216 02:09:13.201794 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.201883 kubelet[2826]: W1216 02:09:13.201827 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.201883 kubelet[2826]: E1216 02:09:13.201848 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.235602 kubelet[2826]: E1216 02:09:13.235499 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.236399 kubelet[2826]: W1216 02:09:13.235768 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.236399 kubelet[2826]: E1216 02:09:13.236130 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.246586 containerd[1587]: time="2025-12-16T02:09:13.246459942Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-55485b9c66-8zn8k,Uid:02d9702b-e9a7-4584-9874-f821dbebc469,Namespace:calico-system,Attempt:0,} returns sandbox id \"b0422c1f88f95aac23ed241384f3d7006bf671ed2312c1be101265da76341c8b\"" Dec 16 02:09:13.248711 kubelet[2826]: E1216 02:09:13.248566 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.248711 kubelet[2826]: W1216 02:09:13.248588 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.248711 kubelet[2826]: E1216 02:09:13.248609 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.249118 kubelet[2826]: E1216 02:09:13.249103 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.250823 kubelet[2826]: W1216 02:09:13.250747 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.251073 kubelet[2826]: E1216 02:09:13.250945 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.251353 kubelet[2826]: E1216 02:09:13.251276 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.252165 kubelet[2826]: W1216 02:09:13.252104 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.252756 kubelet[2826]: E1216 02:09:13.252297 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.252821 containerd[1587]: time="2025-12-16T02:09:13.252485609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 02:09:13.254323 kubelet[2826]: E1216 02:09:13.254106 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.254323 kubelet[2826]: W1216 02:09:13.254132 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.254323 kubelet[2826]: E1216 02:09:13.254148 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.254512 kubelet[2826]: E1216 02:09:13.254500 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.254580 kubelet[2826]: W1216 02:09:13.254568 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.254713 kubelet[2826]: E1216 02:09:13.254626 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.255753 kubelet[2826]: E1216 02:09:13.255686 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.255848 kubelet[2826]: W1216 02:09:13.255832 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.255945 kubelet[2826]: E1216 02:09:13.255918 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.257269 kubelet[2826]: E1216 02:09:13.257177 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.257269 kubelet[2826]: W1216 02:09:13.257199 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.257269 kubelet[2826]: E1216 02:09:13.257214 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.258064 kubelet[2826]: E1216 02:09:13.257695 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.258064 kubelet[2826]: W1216 02:09:13.257710 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.258064 kubelet[2826]: E1216 02:09:13.257723 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.260286 kubelet[2826]: E1216 02:09:13.260206 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.260286 kubelet[2826]: W1216 02:09:13.260222 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.260286 kubelet[2826]: E1216 02:09:13.260238 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.260869 kubelet[2826]: E1216 02:09:13.260803 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.260869 kubelet[2826]: W1216 02:09:13.260816 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.260869 kubelet[2826]: E1216 02:09:13.260829 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.261168 kubelet[2826]: E1216 02:09:13.261154 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.261311 kubelet[2826]: W1216 02:09:13.261245 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.261311 kubelet[2826]: E1216 02:09:13.261263 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.261631 kubelet[2826]: E1216 02:09:13.261567 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.261631 kubelet[2826]: W1216 02:09:13.261582 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.261631 kubelet[2826]: E1216 02:09:13.261592 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.262478 kubelet[2826]: E1216 02:09:13.262460 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.262992 kubelet[2826]: W1216 02:09:13.262914 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.262992 kubelet[2826]: E1216 02:09:13.262939 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.263916 kubelet[2826]: E1216 02:09:13.263842 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.263916 kubelet[2826]: W1216 02:09:13.263858 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.263916 kubelet[2826]: E1216 02:09:13.263871 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.264432 kubelet[2826]: E1216 02:09:13.264332 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.264720 kubelet[2826]: W1216 02:09:13.264529 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.264720 kubelet[2826]: E1216 02:09:13.264552 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.265229 kubelet[2826]: E1216 02:09:13.265214 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.265485 kubelet[2826]: W1216 02:09:13.265383 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.265485 kubelet[2826]: E1216 02:09:13.265411 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.266179 kubelet[2826]: E1216 02:09:13.266163 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.266460 kubelet[2826]: W1216 02:09:13.266350 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.266460 kubelet[2826]: E1216 02:09:13.266373 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.267257 kubelet[2826]: E1216 02:09:13.267103 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.267257 kubelet[2826]: W1216 02:09:13.267117 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.267257 kubelet[2826]: E1216 02:09:13.267129 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.267816 kubelet[2826]: E1216 02:09:13.267618 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.267816 kubelet[2826]: W1216 02:09:13.267630 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.267816 kubelet[2826]: E1216 02:09:13.267641 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.268475 kubelet[2826]: E1216 02:09:13.268376 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.268475 kubelet[2826]: W1216 02:09:13.268389 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.268475 kubelet[2826]: E1216 02:09:13.268401 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.273246 containerd[1587]: time="2025-12-16T02:09:13.273166100Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cztls,Uid:1c82a029-4ad6-4f52-9c8c-df108579e407,Namespace:calico-system,Attempt:0,}" Dec 16 02:09:13.289921 kubelet[2826]: E1216 02:09:13.289728 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.289921 kubelet[2826]: W1216 02:09:13.289752 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.289921 kubelet[2826]: E1216 02:09:13.289772 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.289921 kubelet[2826]: I1216 02:09:13.289813 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g78tt\" (UniqueName: \"kubernetes.io/projected/919dd2b2-2bc2-4394-9fed-3f9f47f938e5-kube-api-access-g78tt\") pod \"csi-node-driver-rp5zk\" (UID: \"919dd2b2-2bc2-4394-9fed-3f9f47f938e5\") " pod="calico-system/csi-node-driver-rp5zk" Dec 16 02:09:13.290569 kubelet[2826]: E1216 02:09:13.290472 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.290569 kubelet[2826]: W1216 02:09:13.290489 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.290569 kubelet[2826]: E1216 02:09:13.290504 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.290569 kubelet[2826]: I1216 02:09:13.290533 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/919dd2b2-2bc2-4394-9fed-3f9f47f938e5-varrun\") pod \"csi-node-driver-rp5zk\" (UID: \"919dd2b2-2bc2-4394-9fed-3f9f47f938e5\") " pod="calico-system/csi-node-driver-rp5zk" Dec 16 02:09:13.290936 kubelet[2826]: E1216 02:09:13.290765 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.290936 kubelet[2826]: W1216 02:09:13.290787 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.290936 kubelet[2826]: E1216 02:09:13.290801 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.290936 kubelet[2826]: E1216 02:09:13.290936 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.291120 kubelet[2826]: W1216 02:09:13.290944 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.291120 kubelet[2826]: E1216 02:09:13.290952 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.291258 kubelet[2826]: E1216 02:09:13.291236 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.291258 kubelet[2826]: W1216 02:09:13.291249 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.291479 kubelet[2826]: E1216 02:09:13.291260 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.291479 kubelet[2826]: I1216 02:09:13.291283 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/919dd2b2-2bc2-4394-9fed-3f9f47f938e5-kubelet-dir\") pod \"csi-node-driver-rp5zk\" (UID: \"919dd2b2-2bc2-4394-9fed-3f9f47f938e5\") " pod="calico-system/csi-node-driver-rp5zk" Dec 16 02:09:13.291794 kubelet[2826]: E1216 02:09:13.291777 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.292020 kubelet[2826]: W1216 02:09:13.291869 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.292020 kubelet[2826]: E1216 02:09:13.291888 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.292352 kubelet[2826]: E1216 02:09:13.292245 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.292352 kubelet[2826]: W1216 02:09:13.292258 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.292352 kubelet[2826]: E1216 02:09:13.292269 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.292729 kubelet[2826]: E1216 02:09:13.292698 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.292729 kubelet[2826]: W1216 02:09:13.292714 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.292729 kubelet[2826]: E1216 02:09:13.292728 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.292932 kubelet[2826]: I1216 02:09:13.292754 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/919dd2b2-2bc2-4394-9fed-3f9f47f938e5-registration-dir\") pod \"csi-node-driver-rp5zk\" (UID: \"919dd2b2-2bc2-4394-9fed-3f9f47f938e5\") " pod="calico-system/csi-node-driver-rp5zk" Dec 16 02:09:13.293227 kubelet[2826]: E1216 02:09:13.293005 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.293227 kubelet[2826]: W1216 02:09:13.293158 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.293227 kubelet[2826]: E1216 02:09:13.293171 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.293741 kubelet[2826]: E1216 02:09:13.293722 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.293936 kubelet[2826]: W1216 02:09:13.293813 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.293936 kubelet[2826]: E1216 02:09:13.293834 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.294306 kubelet[2826]: E1216 02:09:13.294243 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.294306 kubelet[2826]: W1216 02:09:13.294257 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.294306 kubelet[2826]: E1216 02:09:13.294271 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.294522 kubelet[2826]: I1216 02:09:13.294497 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/919dd2b2-2bc2-4394-9fed-3f9f47f938e5-socket-dir\") pod \"csi-node-driver-rp5zk\" (UID: \"919dd2b2-2bc2-4394-9fed-3f9f47f938e5\") " pod="calico-system/csi-node-driver-rp5zk" Dec 16 02:09:13.295081 kubelet[2826]: E1216 02:09:13.294853 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.295081 kubelet[2826]: W1216 02:09:13.294874 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.295081 kubelet[2826]: E1216 02:09:13.294890 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.295338 kubelet[2826]: E1216 02:09:13.295307 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.295338 kubelet[2826]: W1216 02:09:13.295329 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.295430 kubelet[2826]: E1216 02:09:13.295343 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.296203 kubelet[2826]: E1216 02:09:13.296181 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.296203 kubelet[2826]: W1216 02:09:13.296199 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.296467 kubelet[2826]: E1216 02:09:13.296217 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.296467 kubelet[2826]: E1216 02:09:13.296372 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.296467 kubelet[2826]: W1216 02:09:13.296379 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.296467 kubelet[2826]: E1216 02:09:13.296387 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.311090 containerd[1587]: time="2025-12-16T02:09:13.310991594Z" level=info msg="connecting to shim a837472edd507183135d4b0ca7706d4288265433f75ceaed2b40d73399990d2b" address="unix:///run/containerd/s/495e23fe1e89345cd847743a52e6ce3721eef37a4b3d90c1569e30f042bdfdd4" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:09:13.343350 systemd[1]: Started cri-containerd-a837472edd507183135d4b0ca7706d4288265433f75ceaed2b40d73399990d2b.scope - libcontainer container a837472edd507183135d4b0ca7706d4288265433f75ceaed2b40d73399990d2b. Dec 16 02:09:13.360000 audit: BPF prog-id=156 op=LOAD Dec 16 02:09:13.361000 audit: BPF prog-id=157 op=LOAD Dec 16 02:09:13.361000 audit[3347]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138333734373265646435303731383331333564346230636137373036 Dec 16 02:09:13.361000 audit: BPF prog-id=157 op=UNLOAD Dec 16 02:09:13.361000 audit[3347]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138333734373265646435303731383331333564346230636137373036 Dec 16 02:09:13.361000 audit: BPF prog-id=158 op=LOAD Dec 16 02:09:13.361000 audit[3347]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138333734373265646435303731383331333564346230636137373036 Dec 16 02:09:13.361000 audit: BPF prog-id=159 op=LOAD Dec 16 02:09:13.361000 audit[3347]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3336 
pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138333734373265646435303731383331333564346230636137373036 Dec 16 02:09:13.361000 audit: BPF prog-id=159 op=UNLOAD Dec 16 02:09:13.361000 audit[3347]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138333734373265646435303731383331333564346230636137373036 Dec 16 02:09:13.361000 audit: BPF prog-id=158 op=UNLOAD Dec 16 02:09:13.361000 audit[3347]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138333734373265646435303731383331333564346230636137373036 Dec 16 02:09:13.362000 audit: BPF prog-id=160 op=LOAD Dec 16 02:09:13.362000 audit[3347]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3336 pid=3347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.362000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138333734373265646435303731383331333564346230636137373036 Dec 16 02:09:13.380849 containerd[1587]: time="2025-12-16T02:09:13.380749583Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cztls,Uid:1c82a029-4ad6-4f52-9c8c-df108579e407,Namespace:calico-system,Attempt:0,} returns sandbox id \"a837472edd507183135d4b0ca7706d4288265433f75ceaed2b40d73399990d2b\"" Dec 16 02:09:13.397289 kubelet[2826]: E1216 02:09:13.397095 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.397289 kubelet[2826]: W1216 02:09:13.397121 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.397289 kubelet[2826]: E1216 02:09:13.397142 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.397573 kubelet[2826]: E1216 02:09:13.397558 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.397653 kubelet[2826]: W1216 02:09:13.397638 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.397806 kubelet[2826]: E1216 02:09:13.397745 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.398113 kubelet[2826]: E1216 02:09:13.398070 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.398113 kubelet[2826]: W1216 02:09:13.398089 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.398577 kubelet[2826]: E1216 02:09:13.398526 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.398906 kubelet[2826]: E1216 02:09:13.398887 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.398906 kubelet[2826]: W1216 02:09:13.398903 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.399079 kubelet[2826]: E1216 02:09:13.398917 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.399474 kubelet[2826]: E1216 02:09:13.399411 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.399474 kubelet[2826]: W1216 02:09:13.399424 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.399474 kubelet[2826]: E1216 02:09:13.399436 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.399934 kubelet[2826]: E1216 02:09:13.399911 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.399934 kubelet[2826]: W1216 02:09:13.399927 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.400261 kubelet[2826]: E1216 02:09:13.399941 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.400595 kubelet[2826]: E1216 02:09:13.400354 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.400595 kubelet[2826]: W1216 02:09:13.400373 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.400595 kubelet[2826]: E1216 02:09:13.400386 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.400902 kubelet[2826]: E1216 02:09:13.400882 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.400902 kubelet[2826]: W1216 02:09:13.400898 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.401007 kubelet[2826]: E1216 02:09:13.400912 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.401425 kubelet[2826]: E1216 02:09:13.401400 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.401668 kubelet[2826]: W1216 02:09:13.401505 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.401668 kubelet[2826]: E1216 02:09:13.401529 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.401977 kubelet[2826]: E1216 02:09:13.401936 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.402279 kubelet[2826]: W1216 02:09:13.402082 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.402279 kubelet[2826]: E1216 02:09:13.402105 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.402755 kubelet[2826]: E1216 02:09:13.402678 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.402755 kubelet[2826]: W1216 02:09:13.402719 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.402755 kubelet[2826]: E1216 02:09:13.402737 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.403510 kubelet[2826]: E1216 02:09:13.403480 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.403746 kubelet[2826]: W1216 02:09:13.403618 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.403746 kubelet[2826]: E1216 02:09:13.403647 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.404111 kubelet[2826]: E1216 02:09:13.404094 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.404210 kubelet[2826]: W1216 02:09:13.404196 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.404358 kubelet[2826]: E1216 02:09:13.404289 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.404562 kubelet[2826]: E1216 02:09:13.404550 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.404709 kubelet[2826]: W1216 02:09:13.404626 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.404709 kubelet[2826]: E1216 02:09:13.404644 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.404944 kubelet[2826]: E1216 02:09:13.404932 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.405067 kubelet[2826]: W1216 02:09:13.405009 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.405128 kubelet[2826]: E1216 02:09:13.405116 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.405327 kubelet[2826]: E1216 02:09:13.405315 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.405495 kubelet[2826]: W1216 02:09:13.405395 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.405495 kubelet[2826]: E1216 02:09:13.405413 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.405730 kubelet[2826]: E1216 02:09:13.405715 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.405857 kubelet[2826]: W1216 02:09:13.405789 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.405857 kubelet[2826]: E1216 02:09:13.405806 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.406146 kubelet[2826]: E1216 02:09:13.406128 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.406343 kubelet[2826]: W1216 02:09:13.406231 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.406343 kubelet[2826]: E1216 02:09:13.406286 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.406556 kubelet[2826]: E1216 02:09:13.406543 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.406780 kubelet[2826]: W1216 02:09:13.406617 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.406780 kubelet[2826]: E1216 02:09:13.406637 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.407108 kubelet[2826]: E1216 02:09:13.407094 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.407281 kubelet[2826]: W1216 02:09:13.407178 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.407281 kubelet[2826]: E1216 02:09:13.407196 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.407620 kubelet[2826]: E1216 02:09:13.407609 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.407758 kubelet[2826]: W1216 02:09:13.407693 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.407758 kubelet[2826]: E1216 02:09:13.407710 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.408197 kubelet[2826]: E1216 02:09:13.408181 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.408417 kubelet[2826]: W1216 02:09:13.408277 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.408417 kubelet[2826]: E1216 02:09:13.408297 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.408742 kubelet[2826]: E1216 02:09:13.408725 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.409000 kubelet[2826]: W1216 02:09:13.408897 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.409000 kubelet[2826]: E1216 02:09:13.408917 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.409940 kubelet[2826]: E1216 02:09:13.409797 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.409940 kubelet[2826]: W1216 02:09:13.409809 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.409940 kubelet[2826]: E1216 02:09:13.409825 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.410745 kubelet[2826]: E1216 02:09:13.410728 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.410929 kubelet[2826]: W1216 02:09:13.410831 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.410929 kubelet[2826]: E1216 02:09:13.410848 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:13.415393 kubelet[2826]: E1216 02:09:13.415289 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:13.415393 kubelet[2826]: W1216 02:09:13.415315 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:13.415393 kubelet[2826]: E1216 02:09:13.415339 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:13.854000 audit[3399]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=3399 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:13.854000 audit[3399]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffece89bf0 a2=0 a3=1 items=0 ppid=2986 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.854000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:13.859000 audit[3399]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3399 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:13.859000 audit[3399]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffece89bf0 a2=0 a3=1 items=0 ppid=2986 pid=3399 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:13.859000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:14.448771 kubelet[2826]: E1216 02:09:14.448711 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:09:14.659090 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1638499075.mount: Deactivated successfully. 
Dec 16 02:09:15.236399 containerd[1587]: time="2025-12-16T02:09:15.236320340Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:15.238577 containerd[1587]: time="2025-12-16T02:09:15.238509288Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:15.240239 containerd[1587]: time="2025-12-16T02:09:15.240202675Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:15.244230 containerd[1587]: time="2025-12-16T02:09:15.244185922Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:15.245119 containerd[1587]: time="2025-12-16T02:09:15.245074652Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 1.992550847s" Dec 16 02:09:15.245119 containerd[1587]: time="2025-12-16T02:09:15.245112449Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 16 02:09:15.247069 containerd[1587]: time="2025-12-16T02:09:15.246987622Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 02:09:15.265464 containerd[1587]: time="2025-12-16T02:09:15.265418494Z" level=info msg="CreateContainer within sandbox \"b0422c1f88f95aac23ed241384f3d7006bf671ed2312c1be101265da76341c8b\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 02:09:15.275274 containerd[1587]: time="2025-12-16T02:09:15.275227124Z" level=info msg="Container eb9c2b17b64ab1421b8b98fc5838218c294d8eb0ada968d576f3f59669c07caa: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:09:15.284332 containerd[1587]: time="2025-12-16T02:09:15.284268974Z" level=info msg="CreateContainer within sandbox \"b0422c1f88f95aac23ed241384f3d7006bf671ed2312c1be101265da76341c8b\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"eb9c2b17b64ab1421b8b98fc5838218c294d8eb0ada968d576f3f59669c07caa\"" Dec 16 02:09:15.286211 containerd[1587]: time="2025-12-16T02:09:15.285280534Z" level=info msg="StartContainer for \"eb9c2b17b64ab1421b8b98fc5838218c294d8eb0ada968d576f3f59669c07caa\"" Dec 16 02:09:15.287988 containerd[1587]: time="2025-12-16T02:09:15.287959164Z" level=info msg="connecting to shim eb9c2b17b64ab1421b8b98fc5838218c294d8eb0ada968d576f3f59669c07caa" address="unix:///run/containerd/s/14d80120e117804fa123a6fb2194bbda042613de97df3718d0b32c966749d988" protocol=ttrpc version=3 Dec 16 02:09:15.318421 systemd[1]: Started cri-containerd-eb9c2b17b64ab1421b8b98fc5838218c294d8eb0ada968d576f3f59669c07caa.scope - libcontainer container eb9c2b17b64ab1421b8b98fc5838218c294d8eb0ada968d576f3f59669c07caa. 
Dec 16 02:09:15.334000 audit: BPF prog-id=161 op=LOAD Dec 16 02:09:15.336122 kernel: kauditd_printk_skb: 58 callbacks suppressed Dec 16 02:09:15.336177 kernel: audit: type=1334 audit(1765850955.334:543): prog-id=161 op=LOAD Dec 16 02:09:15.336000 audit: BPF prog-id=162 op=LOAD Dec 16 02:09:15.336000 audit[3410]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3245 pid=3410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:15.340681 kernel: audit: type=1334 audit(1765850955.336:544): prog-id=162 op=LOAD Dec 16 02:09:15.340763 kernel: audit: type=1300 audit(1765850955.336:544): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3245 pid=3410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:15.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396332623137623634616231343231623862393866633538333832 Dec 16 02:09:15.344212 kernel: audit: type=1327 audit(1765850955.336:544): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396332623137623634616231343231623862393866633538333832 Dec 16 02:09:15.344328 kernel: audit: type=1334 audit(1765850955.339:545): prog-id=162 op=UNLOAD Dec 16 02:09:15.339000 audit: BPF prog-id=162 op=UNLOAD Dec 16 02:09:15.339000 audit[3410]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3245 pid=3410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:15.347915 kernel: audit: type=1300 audit(1765850955.339:545): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3245 pid=3410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:15.339000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396332623137623634616231343231623862393866633538333832 Dec 16 02:09:15.350271 kernel: audit: type=1327 audit(1765850955.339:545): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396332623137623634616231343231623862393866633538333832 Dec 16 02:09:15.350326 kernel: audit: type=1334 audit(1765850955.339:546): prog-id=163 op=LOAD Dec 16 02:09:15.339000 audit: BPF prog-id=163 op=LOAD Dec 16 02:09:15.339000 audit[3410]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3245 pid=3410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:15.352922 kernel: audit: type=1300 audit(1765850955.339:546): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3245 pid=3410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:15.353051 kernel: audit: type=1327 audit(1765850955.339:546): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396332623137623634616231343231623862393866633538333832 Dec 16 02:09:15.339000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396332623137623634616231343231623862393866633538333832 Dec 16 02:09:15.339000 audit: BPF prog-id=164 op=LOAD Dec 16 02:09:15.339000 audit[3410]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3245 pid=3410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:15.339000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396332623137623634616231343231623862393866633538333832 Dec 16 02:09:15.339000 audit: BPF prog-id=164 op=UNLOAD Dec 16 02:09:15.339000 audit[3410]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3245 pid=3410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:15.339000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396332623137623634616231343231623862393866633538333832 Dec 16 02:09:15.339000 audit: BPF prog-id=163 op=UNLOAD Dec 16 02:09:15.339000 audit[3410]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3245 pid=3410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:15.339000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396332623137623634616231343231623862393866633538333832 Dec 16 02:09:15.339000 audit: BPF prog-id=165 op=LOAD Dec 16 02:09:15.339000 audit[3410]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3245 pid=3410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:15.339000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562396332623137623634616231343231623862393866633538333832 Dec 16 02:09:15.374374 containerd[1587]: time="2025-12-16T02:09:15.374290864Z" level=info msg="StartContainer for \"eb9c2b17b64ab1421b8b98fc5838218c294d8eb0ada968d576f3f59669c07caa\" returns successfully" Dec 16 02:09:15.581240 kubelet[2826]: E1216 02:09:15.580986 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.581240 kubelet[2826]: W1216 02:09:15.581014 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.583094 kubelet[2826]: E1216 02:09:15.581322 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.583094 kubelet[2826]: E1216 02:09:15.581576 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.583094 kubelet[2826]: W1216 02:09:15.581587 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.583094 kubelet[2826]: E1216 02:09:15.582087 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.583094 kubelet[2826]: E1216 02:09:15.582540 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.583094 kubelet[2826]: W1216 02:09:15.582552 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.583094 kubelet[2826]: E1216 02:09:15.582565 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.584726 kubelet[2826]: E1216 02:09:15.584703 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.584726 kubelet[2826]: W1216 02:09:15.584720 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.585135 kubelet[2826]: E1216 02:09:15.584733 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:15.585387 kubelet[2826]: E1216 02:09:15.585329 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.585387 kubelet[2826]: W1216 02:09:15.585345 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.585387 kubelet[2826]: E1216 02:09:15.585357 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.585561 kubelet[2826]: E1216 02:09:15.585545 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.585561 kubelet[2826]: W1216 02:09:15.585553 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.585601 kubelet[2826]: E1216 02:09:15.585563 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.586220 kubelet[2826]: E1216 02:09:15.585684 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.586220 kubelet[2826]: W1216 02:09:15.585699 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.586220 kubelet[2826]: E1216 02:09:15.585707 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.586220 kubelet[2826]: E1216 02:09:15.585924 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.586220 kubelet[2826]: W1216 02:09:15.585933 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.586220 kubelet[2826]: E1216 02:09:15.585943 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.586379 kubelet[2826]: E1216 02:09:15.586259 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.586379 kubelet[2826]: W1216 02:09:15.586271 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.586379 kubelet[2826]: E1216 02:09:15.586280 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:15.587671 kubelet[2826]: E1216 02:09:15.587157 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.587671 kubelet[2826]: W1216 02:09:15.587208 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.587671 kubelet[2826]: E1216 02:09:15.587222 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.587671 kubelet[2826]: E1216 02:09:15.587398 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.587671 kubelet[2826]: W1216 02:09:15.587407 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.587671 kubelet[2826]: E1216 02:09:15.587416 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.587671 kubelet[2826]: E1216 02:09:15.587588 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.587671 kubelet[2826]: W1216 02:09:15.587596 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.587671 kubelet[2826]: E1216 02:09:15.587604 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.587947 kubelet[2826]: E1216 02:09:15.587804 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.587947 kubelet[2826]: W1216 02:09:15.587813 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.587947 kubelet[2826]: E1216 02:09:15.587821 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.588116 kubelet[2826]: E1216 02:09:15.588095 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.588116 kubelet[2826]: W1216 02:09:15.588111 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.588178 kubelet[2826]: E1216 02:09:15.588122 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:15.589066 kubelet[2826]: E1216 02:09:15.588259 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.589066 kubelet[2826]: W1216 02:09:15.588272 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.589066 kubelet[2826]: E1216 02:09:15.588281 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.618932 kubelet[2826]: E1216 02:09:15.618828 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.618932 kubelet[2826]: W1216 02:09:15.618864 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.618932 kubelet[2826]: E1216 02:09:15.618886 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.620385 kubelet[2826]: E1216 02:09:15.620348 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.620385 kubelet[2826]: W1216 02:09:15.620366 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.620385 kubelet[2826]: E1216 02:09:15.620381 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.622121 kubelet[2826]: E1216 02:09:15.622095 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.622121 kubelet[2826]: W1216 02:09:15.622113 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.622121 kubelet[2826]: E1216 02:09:15.622129 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.622379 kubelet[2826]: E1216 02:09:15.622361 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.622379 kubelet[2826]: W1216 02:09:15.622373 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.622499 kubelet[2826]: E1216 02:09:15.622382 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:15.622584 kubelet[2826]: E1216 02:09:15.622566 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.622584 kubelet[2826]: W1216 02:09:15.622580 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.622670 kubelet[2826]: E1216 02:09:15.622589 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.622755 kubelet[2826]: E1216 02:09:15.622741 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.622755 kubelet[2826]: W1216 02:09:15.622752 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.622832 kubelet[2826]: E1216 02:09:15.622761 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.622899 kubelet[2826]: E1216 02:09:15.622885 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.622899 kubelet[2826]: W1216 02:09:15.622895 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.622972 kubelet[2826]: E1216 02:09:15.622904 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.623093 kubelet[2826]: E1216 02:09:15.623075 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.623093 kubelet[2826]: W1216 02:09:15.623088 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.623199 kubelet[2826]: E1216 02:09:15.623096 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.623268 kubelet[2826]: E1216 02:09:15.623256 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.623268 kubelet[2826]: W1216 02:09:15.623266 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.623320 kubelet[2826]: E1216 02:09:15.623274 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:15.623741 kubelet[2826]: E1216 02:09:15.623657 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.623741 kubelet[2826]: W1216 02:09:15.623668 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.623741 kubelet[2826]: E1216 02:09:15.623677 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.623884 kubelet[2826]: E1216 02:09:15.623827 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.623884 kubelet[2826]: W1216 02:09:15.623835 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.623884 kubelet[2826]: E1216 02:09:15.623842 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.624109 kubelet[2826]: E1216 02:09:15.624090 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.624109 kubelet[2826]: W1216 02:09:15.624103 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.624199 kubelet[2826]: E1216 02:09:15.624115 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.624380 kubelet[2826]: E1216 02:09:15.624360 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.624380 kubelet[2826]: W1216 02:09:15.624375 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.624476 kubelet[2826]: E1216 02:09:15.624387 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.624779 kubelet[2826]: E1216 02:09:15.624707 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.624779 kubelet[2826]: W1216 02:09:15.624719 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.624779 kubelet[2826]: E1216 02:09:15.624729 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:15.625637 kubelet[2826]: E1216 02:09:15.625215 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.625637 kubelet[2826]: W1216 02:09:15.625566 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.625637 kubelet[2826]: E1216 02:09:15.625586 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.626504 kubelet[2826]: E1216 02:09:15.626369 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.626593 kubelet[2826]: W1216 02:09:15.626578 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.626810 kubelet[2826]: E1216 02:09:15.626635 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.627039 kubelet[2826]: E1216 02:09:15.627005 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.627084 kubelet[2826]: W1216 02:09:15.627065 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.627084 kubelet[2826]: E1216 02:09:15.627079 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:15.628786 kubelet[2826]: E1216 02:09:15.628764 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:15.628915 kubelet[2826]: W1216 02:09:15.628859 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:15.628915 kubelet[2826]: E1216 02:09:15.628881 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:16.448360 kubelet[2826]: E1216 02:09:16.448074 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:09:16.564606 kubelet[2826]: I1216 02:09:16.564537 2826 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 02:09:16.596885 kubelet[2826]: E1216 02:09:16.596821 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.597746 kubelet[2826]: W1216 02:09:16.596857 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.597746 kubelet[2826]: E1216 02:09:16.597127 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.598638 kubelet[2826]: E1216 02:09:16.598346 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.598638 kubelet[2826]: W1216 02:09:16.598417 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.598638 kubelet[2826]: E1216 02:09:16.598452 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.599139 kubelet[2826]: E1216 02:09:16.599114 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.599520 kubelet[2826]: W1216 02:09:16.599437 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.599660 kubelet[2826]: E1216 02:09:16.599493 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.600010 kubelet[2826]: E1216 02:09:16.599988 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.600269 kubelet[2826]: W1216 02:09:16.600136 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.600269 kubelet[2826]: E1216 02:09:16.600158 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:16.600469 kubelet[2826]: E1216 02:09:16.600457 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.600608 kubelet[2826]: W1216 02:09:16.600536 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.600608 kubelet[2826]: E1216 02:09:16.600553 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.600844 kubelet[2826]: E1216 02:09:16.600804 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.600844 kubelet[2826]: W1216 02:09:16.600818 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.600844 kubelet[2826]: E1216 02:09:16.600828 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.601205 kubelet[2826]: E1216 02:09:16.601152 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.601205 kubelet[2826]: W1216 02:09:16.601164 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.601205 kubelet[2826]: E1216 02:09:16.601174 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.601574 kubelet[2826]: E1216 02:09:16.601557 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.601758 kubelet[2826]: W1216 02:09:16.601638 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.601758 kubelet[2826]: E1216 02:09:16.601654 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.601909 kubelet[2826]: E1216 02:09:16.601897 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.601973 kubelet[2826]: W1216 02:09:16.601963 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.602065 kubelet[2826]: E1216 02:09:16.602021 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:16.602424 kubelet[2826]: E1216 02:09:16.602319 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.602424 kubelet[2826]: W1216 02:09:16.602330 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.602424 kubelet[2826]: E1216 02:09:16.602340 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.602614 kubelet[2826]: E1216 02:09:16.602601 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.602672 kubelet[2826]: W1216 02:09:16.602661 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.602803 kubelet[2826]: E1216 02:09:16.602714 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.602913 kubelet[2826]: E1216 02:09:16.602901 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.603061 kubelet[2826]: W1216 02:09:16.602965 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.603061 kubelet[2826]: E1216 02:09:16.602979 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.603432 kubelet[2826]: E1216 02:09:16.603295 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.603432 kubelet[2826]: W1216 02:09:16.603306 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.603432 kubelet[2826]: E1216 02:09:16.603316 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.603600 kubelet[2826]: E1216 02:09:16.603588 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.603660 kubelet[2826]: W1216 02:09:16.603649 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.603718 kubelet[2826]: E1216 02:09:16.603707 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:16.604022 kubelet[2826]: E1216 02:09:16.603932 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.604022 kubelet[2826]: W1216 02:09:16.603951 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.604022 kubelet[2826]: E1216 02:09:16.603961 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.630530 kubelet[2826]: E1216 02:09:16.630488 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.630530 kubelet[2826]: W1216 02:09:16.630516 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.630530 kubelet[2826]: E1216 02:09:16.630540 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.630955 kubelet[2826]: E1216 02:09:16.630910 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.630955 kubelet[2826]: W1216 02:09:16.630934 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.630955 kubelet[2826]: E1216 02:09:16.630954 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.631422 kubelet[2826]: E1216 02:09:16.631354 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.631422 kubelet[2826]: W1216 02:09:16.631387 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.631422 kubelet[2826]: E1216 02:09:16.631407 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.631760 kubelet[2826]: E1216 02:09:16.631728 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.631844 kubelet[2826]: W1216 02:09:16.631763 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.631844 kubelet[2826]: E1216 02:09:16.631780 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:16.632076 kubelet[2826]: E1216 02:09:16.632046 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.632076 kubelet[2826]: W1216 02:09:16.632067 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.632076 kubelet[2826]: E1216 02:09:16.632081 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.632531 kubelet[2826]: E1216 02:09:16.632484 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.632531 kubelet[2826]: W1216 02:09:16.632501 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.632531 kubelet[2826]: E1216 02:09:16.632517 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.632816 kubelet[2826]: E1216 02:09:16.632780 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.632816 kubelet[2826]: W1216 02:09:16.632801 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.632816 kubelet[2826]: E1216 02:09:16.632815 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.633115 kubelet[2826]: E1216 02:09:16.633100 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.633115 kubelet[2826]: W1216 02:09:16.633116 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.633291 kubelet[2826]: E1216 02:09:16.633130 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.633502 kubelet[2826]: E1216 02:09:16.633465 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.633502 kubelet[2826]: W1216 02:09:16.633480 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.633502 kubelet[2826]: E1216 02:09:16.633495 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:16.634008 kubelet[2826]: E1216 02:09:16.633986 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.634127 kubelet[2826]: W1216 02:09:16.634053 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.634127 kubelet[2826]: E1216 02:09:16.634071 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.634332 kubelet[2826]: E1216 02:09:16.634312 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.634332 kubelet[2826]: W1216 02:09:16.634332 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.634526 kubelet[2826]: E1216 02:09:16.634347 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.634605 kubelet[2826]: E1216 02:09:16.634582 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.634661 kubelet[2826]: W1216 02:09:16.634599 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.634661 kubelet[2826]: E1216 02:09:16.634655 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.634885 kubelet[2826]: E1216 02:09:16.634857 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.634885 kubelet[2826]: W1216 02:09:16.634880 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.634998 kubelet[2826]: E1216 02:09:16.634890 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.635151 kubelet[2826]: E1216 02:09:16.635129 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.635151 kubelet[2826]: W1216 02:09:16.635144 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.635151 kubelet[2826]: E1216 02:09:16.635153 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:16.635422 kubelet[2826]: E1216 02:09:16.635362 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.635422 kubelet[2826]: W1216 02:09:16.635387 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.635422 kubelet[2826]: E1216 02:09:16.635398 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.635659 kubelet[2826]: E1216 02:09:16.635604 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.635659 kubelet[2826]: W1216 02:09:16.635614 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.635659 kubelet[2826]: E1216 02:09:16.635623 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.636193 kubelet[2826]: E1216 02:09:16.636144 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.636193 kubelet[2826]: W1216 02:09:16.636161 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.636193 kubelet[2826]: E1216 02:09:16.636172 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 02:09:16.636354 kubelet[2826]: E1216 02:09:16.636332 2826 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 02:09:16.636354 kubelet[2826]: W1216 02:09:16.636340 2826 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 02:09:16.636354 kubelet[2826]: E1216 02:09:16.636347 2826 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 02:09:16.860658 containerd[1587]: time="2025-12-16T02:09:16.860008247Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:16.862726 containerd[1587]: time="2025-12-16T02:09:16.862529981Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:16.864464 containerd[1587]: time="2025-12-16T02:09:16.863859843Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:16.868911 containerd[1587]: time="2025-12-16T02:09:16.868850115Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:16.870009 containerd[1587]: time="2025-12-16T02:09:16.869959513Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.622881099s" Dec 16 02:09:16.870009 containerd[1587]: time="2025-12-16T02:09:16.870007109Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 16 02:09:16.877838 containerd[1587]: time="2025-12-16T02:09:16.877793815Z" level=info msg="CreateContainer within sandbox \"a837472edd507183135d4b0ca7706d4288265433f75ceaed2b40d73399990d2b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 02:09:16.890666 containerd[1587]: time="2025-12-16T02:09:16.890199059Z" level=info msg="Container dbcfc972c71c83d3ff4f2bee4919912fa414698d79ce647b3825b6410290709c: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:09:16.906621 containerd[1587]: time="2025-12-16T02:09:16.906564771Z" level=info msg="CreateContainer within sandbox \"a837472edd507183135d4b0ca7706d4288265433f75ceaed2b40d73399990d2b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"dbcfc972c71c83d3ff4f2bee4919912fa414698d79ce647b3825b6410290709c\"" Dec 16 02:09:16.907396 containerd[1587]: time="2025-12-16T02:09:16.907326355Z" level=info msg="StartContainer for \"dbcfc972c71c83d3ff4f2bee4919912fa414698d79ce647b3825b6410290709c\"" Dec 16 02:09:16.911272 containerd[1587]: time="2025-12-16T02:09:16.911214108Z" level=info msg="connecting to shim dbcfc972c71c83d3ff4f2bee4919912fa414698d79ce647b3825b6410290709c" address="unix:///run/containerd/s/495e23fe1e89345cd847743a52e6ce3721eef37a4b3d90c1569e30f042bdfdd4" protocol=ttrpc version=3 Dec 16 02:09:16.937294 systemd[1]: Started cri-containerd-dbcfc972c71c83d3ff4f2bee4919912fa414698d79ce647b3825b6410290709c.scope - libcontainer container dbcfc972c71c83d3ff4f2bee4919912fa414698d79ce647b3825b6410290709c. 
Dec 16 02:09:16.981000 audit: BPF prog-id=166 op=LOAD Dec 16 02:09:16.981000 audit[3518]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3336 pid=3518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:16.981000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462636663393732633731633833643366663466326265653439313939 Dec 16 02:09:16.981000 audit: BPF prog-id=167 op=LOAD Dec 16 02:09:16.981000 audit[3518]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3336 pid=3518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:16.981000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462636663393732633731633833643366663466326265653439313939 Dec 16 02:09:16.981000 audit: BPF prog-id=167 op=UNLOAD Dec 16 02:09:16.981000 audit[3518]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:16.981000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462636663393732633731633833643366663466326265653439313939 Dec 16 02:09:16.981000 audit: BPF prog-id=166 op=UNLOAD Dec 16 02:09:16.981000 audit[3518]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:16.981000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462636663393732633731633833643366663466326265653439313939 Dec 16 02:09:16.981000 audit: BPF prog-id=168 op=LOAD Dec 16 02:09:16.981000 audit[3518]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3336 pid=3518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:16.981000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462636663393732633731633833643366663466326265653439313939 Dec 16 02:09:17.007014 containerd[1587]: time="2025-12-16T02:09:17.006977348Z" level=info msg="StartContainer for 
\"dbcfc972c71c83d3ff4f2bee4919912fa414698d79ce647b3825b6410290709c\" returns successfully" Dec 16 02:09:17.023274 systemd[1]: cri-containerd-dbcfc972c71c83d3ff4f2bee4919912fa414698d79ce647b3825b6410290709c.scope: Deactivated successfully. Dec 16 02:09:17.026000 audit: BPF prog-id=168 op=UNLOAD Dec 16 02:09:17.028086 containerd[1587]: time="2025-12-16T02:09:17.028052049Z" level=info msg="received container exit event container_id:\"dbcfc972c71c83d3ff4f2bee4919912fa414698d79ce647b3825b6410290709c\" id:\"dbcfc972c71c83d3ff4f2bee4919912fa414698d79ce647b3825b6410290709c\" pid:3531 exited_at:{seconds:1765850957 nanos:26918447}" Dec 16 02:09:17.049894 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dbcfc972c71c83d3ff4f2bee4919912fa414698d79ce647b3825b6410290709c-rootfs.mount: Deactivated successfully. Dec 16 02:09:17.574331 containerd[1587]: time="2025-12-16T02:09:17.574216402Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 02:09:17.600311 kubelet[2826]: I1216 02:09:17.598541 2826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-55485b9c66-8zn8k" podStartSLOduration=3.603635786 podStartE2EDuration="5.5985182s" podCreationTimestamp="2025-12-16 02:09:12 +0000 UTC" firstStartedPulling="2025-12-16 02:09:13.251616406 +0000 UTC m=+27.968881448" lastFinishedPulling="2025-12-16 02:09:15.24649886 +0000 UTC m=+29.963763862" observedRunningTime="2025-12-16 02:09:15.608851483 +0000 UTC m=+30.326116525" watchObservedRunningTime="2025-12-16 02:09:17.5985182 +0000 UTC m=+32.315783242" Dec 16 02:09:18.447912 kubelet[2826]: E1216 02:09:18.447850 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:09:20.175733 containerd[1587]: time="2025-12-16T02:09:20.175668518Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:20.177509 containerd[1587]: time="2025-12-16T02:09:20.177448497Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Dec 16 02:09:20.178461 containerd[1587]: time="2025-12-16T02:09:20.178405843Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:20.182054 containerd[1587]: time="2025-12-16T02:09:20.181687339Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:20.182338 containerd[1587]: time="2025-12-16T02:09:20.182184471Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.607917232s" Dec 16 02:09:20.182338 containerd[1587]: time="2025-12-16T02:09:20.182312263Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference 
\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 16 02:09:20.189831 containerd[1587]: time="2025-12-16T02:09:20.189784483Z" level=info msg="CreateContainer within sandbox \"a837472edd507183135d4b0ca7706d4288265433f75ceaed2b40d73399990d2b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 02:09:20.204356 containerd[1587]: time="2025-12-16T02:09:20.203368158Z" level=info msg="Container 3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:09:20.213650 containerd[1587]: time="2025-12-16T02:09:20.213560744Z" level=info msg="CreateContainer within sandbox \"a837472edd507183135d4b0ca7706d4288265433f75ceaed2b40d73399990d2b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83\"" Dec 16 02:09:20.214390 containerd[1587]: time="2025-12-16T02:09:20.214329620Z" level=info msg="StartContainer for \"3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83\"" Dec 16 02:09:20.218500 containerd[1587]: time="2025-12-16T02:09:20.217378489Z" level=info msg="connecting to shim 3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83" address="unix:///run/containerd/s/495e23fe1e89345cd847743a52e6ce3721eef37a4b3d90c1569e30f042bdfdd4" protocol=ttrpc version=3 Dec 16 02:09:20.246494 systemd[1]: Started cri-containerd-3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83.scope - libcontainer container 3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83. Dec 16 02:09:20.326000 audit: BPF prog-id=169 op=LOAD Dec 16 02:09:20.326000 audit[3579]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3336 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:20.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365646631376535353366383563613861636637613636643832666565 Dec 16 02:09:20.326000 audit: BPF prog-id=170 op=LOAD Dec 16 02:09:20.326000 audit[3579]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3336 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:20.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365646631376535353366383563613861636637613636643832666565 Dec 16 02:09:20.326000 audit: BPF prog-id=170 op=UNLOAD Dec 16 02:09:20.326000 audit[3579]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:20.326000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365646631376535353366383563613861636637613636643832666565 Dec 16 02:09:20.326000 audit: BPF prog-id=169 op=UNLOAD Dec 16 02:09:20.326000 audit[3579]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:20.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365646631376535353366383563613861636637613636643832666565 Dec 16 02:09:20.326000 audit: BPF prog-id=171 op=LOAD Dec 16 02:09:20.326000 audit[3579]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3336 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:20.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365646631376535353366383563613861636637613636643832666565 Dec 16 02:09:20.351911 containerd[1587]: time="2025-12-16T02:09:20.351832637Z" level=info msg="StartContainer for \"3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83\" returns successfully" Dec 16 02:09:20.449464 kubelet[2826]: E1216 02:09:20.448968 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:09:20.944302 containerd[1587]: time="2025-12-16T02:09:20.944237314Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 02:09:20.947642 systemd[1]: cri-containerd-3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83.scope: Deactivated successfully. Dec 16 02:09:20.948360 systemd[1]: cri-containerd-3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83.scope: Consumed 563ms CPU time, 188.7M memory peak, 165.9M written to disk. 
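The PROCTITLE values in the audit records above are the process's argv, hex-encoded with NUL separators, which is why they all begin with 72756E63 ("runc"). A minimal decoding sketch, assuming Python 3 is available; the hex literal below is a truncated sample copied from the records above:

    # Decode an audit PROCTITLE field: hex-encoded argv joined by NUL bytes.
    proctitle_hex = (
        "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E63"
        "2F6B38732E696F002D2D6C6F67"   # truncated sample from the records above
    )
    argv = [part.decode() for part in bytes.fromhex(proctitle_hex).split(b"\x00")]
    print(argv)   # ['runc', '--root', '/run/containerd/runc/k8s.io', '--log']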
Dec 16 02:09:20.951221 containerd[1587]: time="2025-12-16T02:09:20.950101064Z" level=info msg="received container exit event container_id:\"3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83\" id:\"3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83\" pid:3591 exited_at:{seconds:1765850960 nanos:949450541}" Dec 16 02:09:20.950000 audit: BPF prog-id=171 op=UNLOAD Dec 16 02:09:20.953266 kernel: kauditd_printk_skb: 43 callbacks suppressed Dec 16 02:09:20.953356 kernel: audit: type=1334 audit(1765850960.950:562): prog-id=171 op=UNLOAD Dec 16 02:09:20.958139 kubelet[2826]: I1216 02:09:20.957796 2826 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Dec 16 02:09:20.990643 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3edf17e553f85ca8acf7a66d82fee3e40f186ba35770ca107b8038a12f7b0a83-rootfs.mount: Deactivated successfully. Dec 16 02:09:21.022015 systemd[1]: Created slice kubepods-burstable-pod3d6c2c9e_73af_4dcd_8b45_11a259f9bc11.slice - libcontainer container kubepods-burstable-pod3d6c2c9e_73af_4dcd_8b45_11a259f9bc11.slice. Dec 16 02:09:21.059345 systemd[1]: Created slice kubepods-besteffort-pod4f109e8a_b6cd_4daa_a636_a987203ce9dc.slice - libcontainer container kubepods-besteffort-pod4f109e8a_b6cd_4daa_a636_a987203ce9dc.slice. Dec 16 02:09:21.060474 kubelet[2826]: I1216 02:09:21.059805 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d6c2c9e-73af-4dcd-8b45-11a259f9bc11-config-volume\") pod \"coredns-66bc5c9577-572qq\" (UID: \"3d6c2c9e-73af-4dcd-8b45-11a259f9bc11\") " pod="kube-system/coredns-66bc5c9577-572qq" Dec 16 02:09:21.063085 kubelet[2826]: I1216 02:09:21.062009 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8hdh\" (UniqueName: \"kubernetes.io/projected/3d6c2c9e-73af-4dcd-8b45-11a259f9bc11-kube-api-access-l8hdh\") pod \"coredns-66bc5c9577-572qq\" (UID: \"3d6c2c9e-73af-4dcd-8b45-11a259f9bc11\") " pod="kube-system/coredns-66bc5c9577-572qq" Dec 16 02:09:21.087206 systemd[1]: Created slice kubepods-besteffort-poda4f3ee57_42ce_4008_b96e_85199f6fd632.slice - libcontainer container kubepods-besteffort-poda4f3ee57_42ce_4008_b96e_85199f6fd632.slice. Dec 16 02:09:21.103665 systemd[1]: Created slice kubepods-besteffort-pod24ee6e1b_64f9_47f5_86c3_17009d2e74c9.slice - libcontainer container kubepods-besteffort-pod24ee6e1b_64f9_47f5_86c3_17009d2e74c9.slice. Dec 16 02:09:21.116387 systemd[1]: Created slice kubepods-burstable-podee5c1b74_6a31_486f_9498_a4be28b35a8a.slice - libcontainer container kubepods-burstable-podee5c1b74_6a31_486f_9498_a4be28b35a8a.slice. Dec 16 02:09:21.127360 systemd[1]: Created slice kubepods-besteffort-pod81662439_533e_48ac_902b_f1b932cc9432.slice - libcontainer container kubepods-besteffort-pod81662439_533e_48ac_902b_f1b932cc9432.slice. Dec 16 02:09:21.138959 systemd[1]: Created slice kubepods-besteffort-pode4bfcf46_398e_437f_b7f1_81589479eeb2.slice - libcontainer container kubepods-besteffort-pode4bfcf46_398e_437f_b7f1_81589479eeb2.slice. 
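The kubepods-*.slice units being created above appear to follow the kubelet's cgroup naming: the pod's QoS class plus its UID with "-" replaced by "_" (compare the burstable slice above with pod UID 3d6c2c9e-73af-4dcd-8b45-11a259f9bc11 in the volume entries that follow). A small sketch of that mapping, assuming Python 3:

    # Reconstruct the slice name the kubelet uses for a pod, as seen in the entries above.
    def pod_slice(qos_class: str, pod_uid: str) -> str:
        return f"kubepods-{qos_class}-pod{pod_uid.replace('-', '_')}.slice"

    print(pod_slice("burstable", "3d6c2c9e-73af-4dcd-8b45-11a259f9bc11"))
    # kubepods-burstable-pod3d6c2c9e_73af_4dcd_8b45_11a259f9bc11.slice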
Dec 16 02:09:21.164356 kubelet[2826]: I1216 02:09:21.164303 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/e4bfcf46-398e-437f-b7f1-81589479eeb2-goldmane-key-pair\") pod \"goldmane-7c778bb748-cl8m7\" (UID: \"e4bfcf46-398e-437f-b7f1-81589479eeb2\") " pod="calico-system/goldmane-7c778bb748-cl8m7" Dec 16 02:09:21.164356 kubelet[2826]: I1216 02:09:21.164345 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx9tt\" (UniqueName: \"kubernetes.io/projected/e4bfcf46-398e-437f-b7f1-81589479eeb2-kube-api-access-fx9tt\") pod \"goldmane-7c778bb748-cl8m7\" (UID: \"e4bfcf46-398e-437f-b7f1-81589479eeb2\") " pod="calico-system/goldmane-7c778bb748-cl8m7" Dec 16 02:09:21.164356 kubelet[2826]: I1216 02:09:21.164363 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z62t\" (UniqueName: \"kubernetes.io/projected/24ee6e1b-64f9-47f5-86c3-17009d2e74c9-kube-api-access-6z62t\") pod \"calico-apiserver-86cf67c95b-xtmz8\" (UID: \"24ee6e1b-64f9-47f5-86c3-17009d2e74c9\") " pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" Dec 16 02:09:21.164921 kubelet[2826]: I1216 02:09:21.164383 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w45x\" (UniqueName: \"kubernetes.io/projected/4f109e8a-b6cd-4daa-a636-a987203ce9dc-kube-api-access-9w45x\") pod \"calico-apiserver-86cf67c95b-68pm7\" (UID: \"4f109e8a-b6cd-4daa-a636-a987203ce9dc\") " pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" Dec 16 02:09:21.164921 kubelet[2826]: I1216 02:09:21.164398 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/24ee6e1b-64f9-47f5-86c3-17009d2e74c9-calico-apiserver-certs\") pod \"calico-apiserver-86cf67c95b-xtmz8\" (UID: \"24ee6e1b-64f9-47f5-86c3-17009d2e74c9\") " pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" Dec 16 02:09:21.164921 kubelet[2826]: I1216 02:09:21.164415 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee5c1b74-6a31-486f-9498-a4be28b35a8a-config-volume\") pod \"coredns-66bc5c9577-9kfwg\" (UID: \"ee5c1b74-6a31-486f-9498-a4be28b35a8a\") " pod="kube-system/coredns-66bc5c9577-9kfwg" Dec 16 02:09:21.164921 kubelet[2826]: I1216 02:09:21.164434 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/81662439-533e-48ac-902b-f1b932cc9432-whisker-backend-key-pair\") pod \"whisker-68484cb5f8-k5sjh\" (UID: \"81662439-533e-48ac-902b-f1b932cc9432\") " pod="calico-system/whisker-68484cb5f8-k5sjh" Dec 16 02:09:21.164921 kubelet[2826]: I1216 02:09:21.164451 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4f3ee57-42ce-4008-b96e-85199f6fd632-tigera-ca-bundle\") pod \"calico-kube-controllers-75666888-t2jlw\" (UID: \"a4f3ee57-42ce-4008-b96e-85199f6fd632\") " pod="calico-system/calico-kube-controllers-75666888-t2jlw" Dec 16 02:09:21.165350 kubelet[2826]: I1216 02:09:21.164509 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e4bfcf46-398e-437f-b7f1-81589479eeb2-config\") pod \"goldmane-7c778bb748-cl8m7\" (UID: \"e4bfcf46-398e-437f-b7f1-81589479eeb2\") " pod="calico-system/goldmane-7c778bb748-cl8m7" Dec 16 02:09:21.165350 kubelet[2826]: I1216 02:09:21.164524 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4bfcf46-398e-437f-b7f1-81589479eeb2-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-cl8m7\" (UID: \"e4bfcf46-398e-437f-b7f1-81589479eeb2\") " pod="calico-system/goldmane-7c778bb748-cl8m7" Dec 16 02:09:21.165350 kubelet[2826]: I1216 02:09:21.164568 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81662439-533e-48ac-902b-f1b932cc9432-whisker-ca-bundle\") pod \"whisker-68484cb5f8-k5sjh\" (UID: \"81662439-533e-48ac-902b-f1b932cc9432\") " pod="calico-system/whisker-68484cb5f8-k5sjh" Dec 16 02:09:21.166338 kubelet[2826]: I1216 02:09:21.166290 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4zc\" (UniqueName: \"kubernetes.io/projected/ee5c1b74-6a31-486f-9498-a4be28b35a8a-kube-api-access-vt4zc\") pod \"coredns-66bc5c9577-9kfwg\" (UID: \"ee5c1b74-6a31-486f-9498-a4be28b35a8a\") " pod="kube-system/coredns-66bc5c9577-9kfwg" Dec 16 02:09:21.166338 kubelet[2826]: I1216 02:09:21.166340 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-484g2\" (UniqueName: \"kubernetes.io/projected/a4f3ee57-42ce-4008-b96e-85199f6fd632-kube-api-access-484g2\") pod \"calico-kube-controllers-75666888-t2jlw\" (UID: \"a4f3ee57-42ce-4008-b96e-85199f6fd632\") " pod="calico-system/calico-kube-controllers-75666888-t2jlw" Dec 16 02:09:21.166511 kubelet[2826]: I1216 02:09:21.166377 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smhcj\" (UniqueName: \"kubernetes.io/projected/81662439-533e-48ac-902b-f1b932cc9432-kube-api-access-smhcj\") pod \"whisker-68484cb5f8-k5sjh\" (UID: \"81662439-533e-48ac-902b-f1b932cc9432\") " pod="calico-system/whisker-68484cb5f8-k5sjh" Dec 16 02:09:21.166511 kubelet[2826]: I1216 02:09:21.166399 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4f109e8a-b6cd-4daa-a636-a987203ce9dc-calico-apiserver-certs\") pod \"calico-apiserver-86cf67c95b-68pm7\" (UID: \"4f109e8a-b6cd-4daa-a636-a987203ce9dc\") " pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" Dec 16 02:09:21.337201 containerd[1587]: time="2025-12-16T02:09:21.336983352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-572qq,Uid:3d6c2c9e-73af-4dcd-8b45-11a259f9bc11,Namespace:kube-system,Attempt:0,}" Dec 16 02:09:21.375925 containerd[1587]: time="2025-12-16T02:09:21.375885158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cf67c95b-68pm7,Uid:4f109e8a-b6cd-4daa-a636-a987203ce9dc,Namespace:calico-apiserver,Attempt:0,}" Dec 16 02:09:21.402910 containerd[1587]: time="2025-12-16T02:09:21.402836189Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75666888-t2jlw,Uid:a4f3ee57-42ce-4008-b96e-85199f6fd632,Namespace:calico-system,Attempt:0,}" Dec 16 02:09:21.416407 containerd[1587]: 
time="2025-12-16T02:09:21.416370641Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cf67c95b-xtmz8,Uid:24ee6e1b-64f9-47f5-86c3-17009d2e74c9,Namespace:calico-apiserver,Attempt:0,}" Dec 16 02:09:21.424844 containerd[1587]: time="2025-12-16T02:09:21.424798121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9kfwg,Uid:ee5c1b74-6a31-486f-9498-a4be28b35a8a,Namespace:kube-system,Attempt:0,}" Dec 16 02:09:21.445301 containerd[1587]: time="2025-12-16T02:09:21.444890550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-68484cb5f8-k5sjh,Uid:81662439-533e-48ac-902b-f1b932cc9432,Namespace:calico-system,Attempt:0,}" Dec 16 02:09:21.449290 containerd[1587]: time="2025-12-16T02:09:21.449170846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-cl8m7,Uid:e4bfcf46-398e-437f-b7f1-81589479eeb2,Namespace:calico-system,Attempt:0,}" Dec 16 02:09:21.484527 containerd[1587]: time="2025-12-16T02:09:21.484389965Z" level=error msg="Failed to destroy network for sandbox \"034f3c91c2cb72e357da80d07d2dce0a73ad83fc4120831c6c6fc11e0b515f44\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.490902 containerd[1587]: time="2025-12-16T02:09:21.490841548Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-572qq,Uid:3d6c2c9e-73af-4dcd-8b45-11a259f9bc11,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"034f3c91c2cb72e357da80d07d2dce0a73ad83fc4120831c6c6fc11e0b515f44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.491788 kubelet[2826]: E1216 02:09:21.491116 2826 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"034f3c91c2cb72e357da80d07d2dce0a73ad83fc4120831c6c6fc11e0b515f44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.491788 kubelet[2826]: E1216 02:09:21.491185 2826 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"034f3c91c2cb72e357da80d07d2dce0a73ad83fc4120831c6c6fc11e0b515f44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-572qq" Dec 16 02:09:21.491788 kubelet[2826]: E1216 02:09:21.491205 2826 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"034f3c91c2cb72e357da80d07d2dce0a73ad83fc4120831c6c6fc11e0b515f44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-572qq" Dec 16 02:09:21.492786 kubelet[2826]: E1216 02:09:21.491255 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-66bc5c9577-572qq_kube-system(3d6c2c9e-73af-4dcd-8b45-11a259f9bc11)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-572qq_kube-system(3d6c2c9e-73af-4dcd-8b45-11a259f9bc11)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"034f3c91c2cb72e357da80d07d2dce0a73ad83fc4120831c6c6fc11e0b515f44\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-572qq" podUID="3d6c2c9e-73af-4dcd-8b45-11a259f9bc11" Dec 16 02:09:21.541764 containerd[1587]: time="2025-12-16T02:09:21.541579015Z" level=error msg="Failed to destroy network for sandbox \"c67d2bcb894c9099af36d1040092df9df1f0f2e649436dc8de7db32cf65a6fb5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.550399 containerd[1587]: time="2025-12-16T02:09:21.550340477Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cf67c95b-68pm7,Uid:4f109e8a-b6cd-4daa-a636-a987203ce9dc,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c67d2bcb894c9099af36d1040092df9df1f0f2e649436dc8de7db32cf65a6fb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.551835 kubelet[2826]: E1216 02:09:21.550588 2826 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c67d2bcb894c9099af36d1040092df9df1f0f2e649436dc8de7db32cf65a6fb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.551835 kubelet[2826]: E1216 02:09:21.550654 2826 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c67d2bcb894c9099af36d1040092df9df1f0f2e649436dc8de7db32cf65a6fb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" Dec 16 02:09:21.551835 kubelet[2826]: E1216 02:09:21.550674 2826 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c67d2bcb894c9099af36d1040092df9df1f0f2e649436dc8de7db32cf65a6fb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" Dec 16 02:09:21.553101 kubelet[2826]: E1216 02:09:21.550742 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-86cf67c95b-68pm7_calico-apiserver(4f109e8a-b6cd-4daa-a636-a987203ce9dc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-86cf67c95b-68pm7_calico-apiserver(4f109e8a-b6cd-4daa-a636-a987203ce9dc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"c67d2bcb894c9099af36d1040092df9df1f0f2e649436dc8de7db32cf65a6fb5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:09:21.587844 containerd[1587]: time="2025-12-16T02:09:21.587726323Z" level=error msg="Failed to destroy network for sandbox \"9ab569694511abaf723161b2a2fd9ef1e0454490d58d9219c2eebf64a623b9d0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.590946 containerd[1587]: time="2025-12-16T02:09:21.590888477Z" level=error msg="Failed to destroy network for sandbox \"a41a093feb0a42da5d1a491b0f75338b65834ab96daee40368f300ce74c44912\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.593505 containerd[1587]: time="2025-12-16T02:09:21.593324670Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75666888-t2jlw,Uid:a4f3ee57-42ce-4008-b96e-85199f6fd632,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ab569694511abaf723161b2a2fd9ef1e0454490d58d9219c2eebf64a623b9d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.594090 kubelet[2826]: E1216 02:09:21.593828 2826 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ab569694511abaf723161b2a2fd9ef1e0454490d58d9219c2eebf64a623b9d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.594090 kubelet[2826]: E1216 02:09:21.593883 2826 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ab569694511abaf723161b2a2fd9ef1e0454490d58d9219c2eebf64a623b9d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75666888-t2jlw" Dec 16 02:09:21.594090 kubelet[2826]: E1216 02:09:21.593903 2826 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ab569694511abaf723161b2a2fd9ef1e0454490d58d9219c2eebf64a623b9d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75666888-t2jlw" Dec 16 02:09:21.594358 kubelet[2826]: E1216 02:09:21.593958 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-75666888-t2jlw_calico-system(a4f3ee57-42ce-4008-b96e-85199f6fd632)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-75666888-t2jlw_calico-system(a4f3ee57-42ce-4008-b96e-85199f6fd632)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9ab569694511abaf723161b2a2fd9ef1e0454490d58d9219c2eebf64a623b9d0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:09:21.599092 containerd[1587]: time="2025-12-16T02:09:21.598497119Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cf67c95b-xtmz8,Uid:24ee6e1b-64f9-47f5-86c3-17009d2e74c9,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a41a093feb0a42da5d1a491b0f75338b65834ab96daee40368f300ce74c44912\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.599389 kubelet[2826]: E1216 02:09:21.599346 2826 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a41a093feb0a42da5d1a491b0f75338b65834ab96daee40368f300ce74c44912\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.599468 kubelet[2826]: E1216 02:09:21.599402 2826 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a41a093feb0a42da5d1a491b0f75338b65834ab96daee40368f300ce74c44912\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" Dec 16 02:09:21.599468 kubelet[2826]: E1216 02:09:21.599422 2826 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a41a093feb0a42da5d1a491b0f75338b65834ab96daee40368f300ce74c44912\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" Dec 16 02:09:21.599695 kubelet[2826]: E1216 02:09:21.599522 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-86cf67c95b-xtmz8_calico-apiserver(24ee6e1b-64f9-47f5-86c3-17009d2e74c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-86cf67c95b-xtmz8_calico-apiserver(24ee6e1b-64f9-47f5-86c3-17009d2e74c9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a41a093feb0a42da5d1a491b0f75338b65834ab96daee40368f300ce74c44912\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:09:21.614911 containerd[1587]: time="2025-12-16T02:09:21.614562160Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 02:09:21.621080 containerd[1587]: time="2025-12-16T02:09:21.620578805Z" level=error msg="Failed to destroy network for sandbox \"15c504e2ee43786ee11c55217471052b95f7a9ea0543be08bbebdcb9d71b318b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.621898 containerd[1587]: time="2025-12-16T02:09:21.621798301Z" level=error msg="Failed to destroy network for sandbox \"8d8cff7f8d0dda034b3279a4e6e1562467d7a9fa44440de94efd152db72c8c47\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.624209 containerd[1587]: time="2025-12-16T02:09:21.623931230Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9kfwg,Uid:ee5c1b74-6a31-486f-9498-a4be28b35a8a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"15c504e2ee43786ee11c55217471052b95f7a9ea0543be08bbebdcb9d71b318b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.625467 kubelet[2826]: E1216 02:09:21.625169 2826 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15c504e2ee43786ee11c55217471052b95f7a9ea0543be08bbebdcb9d71b318b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.625839 containerd[1587]: time="2025-12-16T02:09:21.625737055Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-cl8m7,Uid:e4bfcf46-398e-437f-b7f1-81589479eeb2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d8cff7f8d0dda034b3279a4e6e1562467d7a9fa44440de94efd152db72c8c47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.626264 kubelet[2826]: E1216 02:09:21.626178 2826 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d8cff7f8d0dda034b3279a4e6e1562467d7a9fa44440de94efd152db72c8c47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.626426 kubelet[2826]: E1216 02:09:21.626225 2826 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d8cff7f8d0dda034b3279a4e6e1562467d7a9fa44440de94efd152db72c8c47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-cl8m7" Dec 16 02:09:21.626645 kubelet[2826]: E1216 02:09:21.626580 2826 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"8d8cff7f8d0dda034b3279a4e6e1562467d7a9fa44440de94efd152db72c8c47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-cl8m7" Dec 16 02:09:21.626768 kubelet[2826]: E1216 02:09:21.626743 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-cl8m7_calico-system(e4bfcf46-398e-437f-b7f1-81589479eeb2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-cl8m7_calico-system(e4bfcf46-398e-437f-b7f1-81589479eeb2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d8cff7f8d0dda034b3279a4e6e1562467d7a9fa44440de94efd152db72c8c47\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:09:21.627424 kubelet[2826]: E1216 02:09:21.627107 2826 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15c504e2ee43786ee11c55217471052b95f7a9ea0543be08bbebdcb9d71b318b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-9kfwg" Dec 16 02:09:21.627424 kubelet[2826]: E1216 02:09:21.627142 2826 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15c504e2ee43786ee11c55217471052b95f7a9ea0543be08bbebdcb9d71b318b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-9kfwg" Dec 16 02:09:21.627424 kubelet[2826]: E1216 02:09:21.627195 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-9kfwg_kube-system(ee5c1b74-6a31-486f-9498-a4be28b35a8a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-9kfwg_kube-system(ee5c1b74-6a31-486f-9498-a4be28b35a8a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"15c504e2ee43786ee11c55217471052b95f7a9ea0543be08bbebdcb9d71b318b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-9kfwg" podUID="ee5c1b74-6a31-486f-9498-a4be28b35a8a" Dec 16 02:09:21.651224 containerd[1587]: time="2025-12-16T02:09:21.651151527Z" level=error msg="Failed to destroy network for sandbox \"90edacfe960b7fb590ddb7e2d40f610ecc653f6254cf21a08dfbc94f10197d4a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.653931 containerd[1587]: time="2025-12-16T02:09:21.653869264Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-68484cb5f8-k5sjh,Uid:81662439-533e-48ac-902b-f1b932cc9432,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"90edacfe960b7fb590ddb7e2d40f610ecc653f6254cf21a08dfbc94f10197d4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.654686 kubelet[2826]: E1216 02:09:21.654404 2826 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90edacfe960b7fb590ddb7e2d40f610ecc653f6254cf21a08dfbc94f10197d4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:21.654686 kubelet[2826]: E1216 02:09:21.654484 2826 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90edacfe960b7fb590ddb7e2d40f610ecc653f6254cf21a08dfbc94f10197d4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-68484cb5f8-k5sjh" Dec 16 02:09:21.654686 kubelet[2826]: E1216 02:09:21.654548 2826 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90edacfe960b7fb590ddb7e2d40f610ecc653f6254cf21a08dfbc94f10197d4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-68484cb5f8-k5sjh" Dec 16 02:09:21.654874 kubelet[2826]: E1216 02:09:21.654628 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-68484cb5f8-k5sjh_calico-system(81662439-533e-48ac-902b-f1b932cc9432)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-68484cb5f8-k5sjh_calico-system(81662439-533e-48ac-902b-f1b932cc9432)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"90edacfe960b7fb590ddb7e2d40f610ecc653f6254cf21a08dfbc94f10197d4a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-68484cb5f8-k5sjh" podUID="81662439-533e-48ac-902b-f1b932cc9432" Dec 16 02:09:22.455803 systemd[1]: Created slice kubepods-besteffort-pod919dd2b2_2bc2_4394_9fed_3f9f47f938e5.slice - libcontainer container kubepods-besteffort-pod919dd2b2_2bc2_4394_9fed_3f9f47f938e5.slice. Dec 16 02:09:22.461879 containerd[1587]: time="2025-12-16T02:09:22.461831665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rp5zk,Uid:919dd2b2-2bc2-4394-9fed-3f9f47f938e5,Namespace:calico-system,Attempt:0,}" Dec 16 02:09:22.516048 containerd[1587]: time="2025-12-16T02:09:22.514103696Z" level=error msg="Failed to destroy network for sandbox \"0c1c0eeded87b7b30d50215c4c089aa5c09231b69cb2e6ee7ca86dc6ee3f8529\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:22.516646 systemd[1]: run-netns-cni\x2d8e4392d9\x2d50a3\x2d8e26\x2db5ab\x2da997c17a455b.mount: Deactivated successfully. 
Dec 16 02:09:22.518862 containerd[1587]: time="2025-12-16T02:09:22.518770670Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rp5zk,Uid:919dd2b2-2bc2-4394-9fed-3f9f47f938e5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c1c0eeded87b7b30d50215c4c089aa5c09231b69cb2e6ee7ca86dc6ee3f8529\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:22.519327 kubelet[2826]: E1216 02:09:22.519265 2826 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c1c0eeded87b7b30d50215c4c089aa5c09231b69cb2e6ee7ca86dc6ee3f8529\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 02:09:22.519327 kubelet[2826]: E1216 02:09:22.519325 2826 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c1c0eeded87b7b30d50215c4c089aa5c09231b69cb2e6ee7ca86dc6ee3f8529\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rp5zk" Dec 16 02:09:22.519941 kubelet[2826]: E1216 02:09:22.519346 2826 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c1c0eeded87b7b30d50215c4c089aa5c09231b69cb2e6ee7ca86dc6ee3f8529\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rp5zk" Dec 16 02:09:22.519941 kubelet[2826]: E1216 02:09:22.519397 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0c1c0eeded87b7b30d50215c4c089aa5c09231b69cb2e6ee7ca86dc6ee3f8529\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:09:26.328850 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3232060715.mount: Deactivated successfully. 
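Every sandbox failure above carries the same hint: the Calico CNI plugin cannot stat /var/lib/calico/nodename, a file that calico/node writes once it is running, and at this point the calico/node image is still being pulled. A minimal node-side check along the lines the error message suggests, assuming Python 3 on the host:

    # The CNI errors above all reduce to this file being absent until calico/node starts.
    from pathlib import Path

    nodename = Path("/var/lib/calico/nodename")
    if nodename.is_file():
        print("calico/node has registered this node as:", nodename.read_text().strip())
    else:
        print("/var/lib/calico/nodename is missing -- calico/node is not running yet")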
Dec 16 02:09:26.354965 containerd[1587]: time="2025-12-16T02:09:26.354856759Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:26.356559 containerd[1587]: time="2025-12-16T02:09:26.356216393Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Dec 16 02:09:26.357670 containerd[1587]: time="2025-12-16T02:09:26.357616426Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:26.360579 containerd[1587]: time="2025-12-16T02:09:26.360502288Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 02:09:26.361951 containerd[1587]: time="2025-12-16T02:09:26.361901600Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 4.747082934s" Dec 16 02:09:26.362151 containerd[1587]: time="2025-12-16T02:09:26.362111513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 16 02:09:26.412842 containerd[1587]: time="2025-12-16T02:09:26.412778995Z" level=info msg="CreateContainer within sandbox \"a837472edd507183135d4b0ca7706d4288265433f75ceaed2b40d73399990d2b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 02:09:26.425469 containerd[1587]: time="2025-12-16T02:09:26.425430646Z" level=info msg="Container c9ff9fb273001a6c6fbc44241391c2183128b518f8eef581e81d2da5ca6a0cec: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:09:26.430131 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2199071608.mount: Deactivated successfully. Dec 16 02:09:26.449350 containerd[1587]: time="2025-12-16T02:09:26.449247278Z" level=info msg="CreateContainer within sandbox \"a837472edd507183135d4b0ca7706d4288265433f75ceaed2b40d73399990d2b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c9ff9fb273001a6c6fbc44241391c2183128b518f8eef581e81d2da5ca6a0cec\"" Dec 16 02:09:26.450996 containerd[1587]: time="2025-12-16T02:09:26.450312322Z" level=info msg="StartContainer for \"c9ff9fb273001a6c6fbc44241391c2183128b518f8eef581e81d2da5ca6a0cec\"" Dec 16 02:09:26.454350 containerd[1587]: time="2025-12-16T02:09:26.454296507Z" level=info msg="connecting to shim c9ff9fb273001a6c6fbc44241391c2183128b518f8eef581e81d2da5ca6a0cec" address="unix:///run/containerd/s/495e23fe1e89345cd847743a52e6ce3721eef37a4b3d90c1569e30f042bdfdd4" protocol=ttrpc version=3 Dec 16 02:09:26.503297 systemd[1]: Started cri-containerd-c9ff9fb273001a6c6fbc44241391c2183128b518f8eef581e81d2da5ca6a0cec.scope - libcontainer container c9ff9fb273001a6c6fbc44241391c2183128b518f8eef581e81d2da5ca6a0cec. 
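A back-of-the-envelope figure from the pull above, taking the "bytes read" counter and the reported pull duration at face value (the counter is what containerd reported, not necessarily the full transfer):

    # Rough pull throughput for the calico/node image, from the figures above.
    bytes_read = 150_930_912      # "bytes read" reported by containerd
    seconds    = 4.747082934      # "Pulled image ... in 4.747082934s"
    print(f"{bytes_read / seconds / 2**20:.1f} MiB/s")   # ~30.3 MiB/s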
Dec 16 02:09:26.567271 kernel: audit: type=1334 audit(1765850966.563:563): prog-id=172 op=LOAD Dec 16 02:09:26.567479 kernel: audit: type=1300 audit(1765850966.563:563): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3336 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:26.563000 audit: BPF prog-id=172 op=LOAD Dec 16 02:09:26.563000 audit[3851]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3336 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:26.563000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666639666232373330303161366336666263343432343133393163 Dec 16 02:09:26.570873 kernel: audit: type=1327 audit(1765850966.563:563): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666639666232373330303161366336666263343432343133393163 Dec 16 02:09:26.570986 kernel: audit: type=1334 audit(1765850966.567:564): prog-id=173 op=LOAD Dec 16 02:09:26.567000 audit: BPF prog-id=173 op=LOAD Dec 16 02:09:26.571455 kernel: audit: type=1300 audit(1765850966.567:564): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3336 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:26.567000 audit[3851]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3336 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:26.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666639666232373330303161366336666263343432343133393163 Dec 16 02:09:26.575651 kernel: audit: type=1327 audit(1765850966.567:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666639666232373330303161366336666263343432343133393163 Dec 16 02:09:26.575764 kernel: audit: type=1334 audit(1765850966.567:565): prog-id=173 op=UNLOAD Dec 16 02:09:26.567000 audit: BPF prog-id=173 op=UNLOAD Dec 16 02:09:26.578475 kernel: audit: type=1300 audit(1765850966.567:565): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:26.578579 kernel: audit: type=1327 audit(1765850966.567:565): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666639666232373330303161366336666263343432343133393163 Dec 16 02:09:26.567000 audit[3851]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:26.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666639666232373330303161366336666263343432343133393163 Dec 16 02:09:26.567000 audit: BPF prog-id=172 op=UNLOAD Dec 16 02:09:26.567000 audit[3851]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3336 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:26.581151 kernel: audit: type=1334 audit(1765850966.567:566): prog-id=172 op=UNLOAD Dec 16 02:09:26.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666639666232373330303161366336666263343432343133393163 Dec 16 02:09:26.567000 audit: BPF prog-id=174 op=LOAD Dec 16 02:09:26.567000 audit[3851]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3336 pid=3851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:26.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666639666232373330303161366336666263343432343133393163 Dec 16 02:09:26.613184 containerd[1587]: time="2025-12-16T02:09:26.613067921Z" level=info msg="StartContainer for \"c9ff9fb273001a6c6fbc44241391c2183128b518f8eef581e81d2da5ca6a0cec\" returns successfully" Dec 16 02:09:26.659149 kubelet[2826]: I1216 02:09:26.658399 2826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-cztls" podStartSLOduration=1.667127186 podStartE2EDuration="14.658380224s" podCreationTimestamp="2025-12-16 02:09:12 +0000 UTC" firstStartedPulling="2025-12-16 02:09:13.385205229 +0000 UTC m=+28.102470271" lastFinishedPulling="2025-12-16 02:09:26.376458307 +0000 UTC m=+41.093723309" observedRunningTime="2025-12-16 02:09:26.654858064 +0000 UTC m=+41.372123066" watchObservedRunningTime="2025-12-16 02:09:26.658380224 +0000 UTC m=+41.375645226" Dec 16 02:09:26.777056 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 02:09:26.777169 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
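The pod-startup figures logged above are mutually consistent: podStartSLOduration appears to be the end-to-end duration minus the image-pull window (lastFinishedPulling - firstStartedPulling). A quick check with the calico-node-cztls numbers, using the m=+ offsets logged above and assuming that relationship holds:

    # Cross-check the latency-tracker entry for calico-node-cztls (values copied from above).
    e2e  = 14.658380224                  # podStartE2EDuration, in seconds
    pull = 41.093723309 - 28.102470271   # lastFinishedPulling - firstStartedPulling (m=+ offsets)
    print(e2e - pull)                    # ~1.667127186, matching podStartSLOduration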
Dec 16 02:09:27.006400 kubelet[2826]: I1216 02:09:27.005995 2826 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/81662439-533e-48ac-902b-f1b932cc9432-whisker-backend-key-pair\") pod \"81662439-533e-48ac-902b-f1b932cc9432\" (UID: \"81662439-533e-48ac-902b-f1b932cc9432\") " Dec 16 02:09:27.006400 kubelet[2826]: I1216 02:09:27.006096 2826 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smhcj\" (UniqueName: \"kubernetes.io/projected/81662439-533e-48ac-902b-f1b932cc9432-kube-api-access-smhcj\") pod \"81662439-533e-48ac-902b-f1b932cc9432\" (UID: \"81662439-533e-48ac-902b-f1b932cc9432\") " Dec 16 02:09:27.006400 kubelet[2826]: I1216 02:09:27.006117 2826 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81662439-533e-48ac-902b-f1b932cc9432-whisker-ca-bundle\") pod \"81662439-533e-48ac-902b-f1b932cc9432\" (UID: \"81662439-533e-48ac-902b-f1b932cc9432\") " Dec 16 02:09:27.006717 kubelet[2826]: I1216 02:09:27.006616 2826 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81662439-533e-48ac-902b-f1b932cc9432-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "81662439-533e-48ac-902b-f1b932cc9432" (UID: "81662439-533e-48ac-902b-f1b932cc9432"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 02:09:27.012538 kubelet[2826]: I1216 02:09:27.012472 2826 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81662439-533e-48ac-902b-f1b932cc9432-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "81662439-533e-48ac-902b-f1b932cc9432" (UID: "81662439-533e-48ac-902b-f1b932cc9432"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 02:09:27.012984 kubelet[2826]: I1216 02:09:27.012860 2826 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81662439-533e-48ac-902b-f1b932cc9432-kube-api-access-smhcj" (OuterVolumeSpecName: "kube-api-access-smhcj") pod "81662439-533e-48ac-902b-f1b932cc9432" (UID: "81662439-533e-48ac-902b-f1b932cc9432"). InnerVolumeSpecName "kube-api-access-smhcj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 02:09:27.106622 kubelet[2826]: I1216 02:09:27.106494 2826 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/81662439-533e-48ac-902b-f1b932cc9432-whisker-backend-key-pair\") on node \"ci-4547-0-0-9-be0981937a\" DevicePath \"\"" Dec 16 02:09:27.106622 kubelet[2826]: I1216 02:09:27.106574 2826 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-smhcj\" (UniqueName: \"kubernetes.io/projected/81662439-533e-48ac-902b-f1b932cc9432-kube-api-access-smhcj\") on node \"ci-4547-0-0-9-be0981937a\" DevicePath \"\"" Dec 16 02:09:27.106622 kubelet[2826]: I1216 02:09:27.106588 2826 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81662439-533e-48ac-902b-f1b932cc9432-whisker-ca-bundle\") on node \"ci-4547-0-0-9-be0981937a\" DevicePath \"\"" Dec 16 02:09:27.330525 systemd[1]: var-lib-kubelet-pods-81662439\x2d533e\x2d48ac\x2d902b\x2df1b932cc9432-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dsmhcj.mount: Deactivated successfully. Dec 16 02:09:27.330635 systemd[1]: var-lib-kubelet-pods-81662439\x2d533e\x2d48ac\x2d902b\x2df1b932cc9432-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 16 02:09:27.457627 systemd[1]: Removed slice kubepods-besteffort-pod81662439_533e_48ac_902b_f1b932cc9432.slice - libcontainer container kubepods-besteffort-pod81662439_533e_48ac_902b_f1b932cc9432.slice. Dec 16 02:09:27.636079 kubelet[2826]: I1216 02:09:27.635846 2826 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 02:09:27.733367 systemd[1]: Created slice kubepods-besteffort-pod76743683_d50e_4bc9_aceb_a84e73d5c7be.slice - libcontainer container kubepods-besteffort-pod76743683_d50e_4bc9_aceb_a84e73d5c7be.slice. 
Dec 16 02:09:27.813333 kubelet[2826]: I1216 02:09:27.813176 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfvp\" (UniqueName: \"kubernetes.io/projected/76743683-d50e-4bc9-aceb-a84e73d5c7be-kube-api-access-wlfvp\") pod \"whisker-b648c4bbd-x7xsf\" (UID: \"76743683-d50e-4bc9-aceb-a84e73d5c7be\") " pod="calico-system/whisker-b648c4bbd-x7xsf" Dec 16 02:09:27.813333 kubelet[2826]: I1216 02:09:27.813321 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/76743683-d50e-4bc9-aceb-a84e73d5c7be-whisker-backend-key-pair\") pod \"whisker-b648c4bbd-x7xsf\" (UID: \"76743683-d50e-4bc9-aceb-a84e73d5c7be\") " pod="calico-system/whisker-b648c4bbd-x7xsf" Dec 16 02:09:27.813827 kubelet[2826]: I1216 02:09:27.813392 2826 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76743683-d50e-4bc9-aceb-a84e73d5c7be-whisker-ca-bundle\") pod \"whisker-b648c4bbd-x7xsf\" (UID: \"76743683-d50e-4bc9-aceb-a84e73d5c7be\") " pod="calico-system/whisker-b648c4bbd-x7xsf" Dec 16 02:09:28.042048 containerd[1587]: time="2025-12-16T02:09:28.041970820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-b648c4bbd-x7xsf,Uid:76743683-d50e-4bc9-aceb-a84e73d5c7be,Namespace:calico-system,Attempt:0,}" Dec 16 02:09:28.287654 systemd-networkd[1480]: calif7a2f307ff8: Link UP Dec 16 02:09:28.290063 systemd-networkd[1480]: calif7a2f307ff8: Gained carrier Dec 16 02:09:28.325846 containerd[1587]: 2025-12-16 02:09:28.072 [INFO][3917] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 02:09:28.325846 containerd[1587]: 2025-12-16 02:09:28.143 [INFO][3917] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0 whisker-b648c4bbd- calico-system 76743683-d50e-4bc9-aceb-a84e73d5c7be 883 0 2025-12-16 02:09:27 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:b648c4bbd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-0-0-9-be0981937a whisker-b648c4bbd-x7xsf eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calif7a2f307ff8 [] [] }} ContainerID="d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" Namespace="calico-system" Pod="whisker-b648c4bbd-x7xsf" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-" Dec 16 02:09:28.325846 containerd[1587]: 2025-12-16 02:09:28.143 [INFO][3917] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" Namespace="calico-system" Pod="whisker-b648c4bbd-x7xsf" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0" Dec 16 02:09:28.325846 containerd[1587]: 2025-12-16 02:09:28.204 [INFO][3928] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" HandleID="k8s-pod-network.d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" Workload="ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0" Dec 16 02:09:28.327420 containerd[1587]: 2025-12-16 02:09:28.205 [INFO][3928] ipam/ipam_plugin.go 275: Auto 
assigning IP ContainerID="d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" HandleID="k8s-pod-network.d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" Workload="ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400028f950), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-9-be0981937a", "pod":"whisker-b648c4bbd-x7xsf", "timestamp":"2025-12-16 02:09:28.204448056 +0000 UTC"}, Hostname:"ci-4547-0-0-9-be0981937a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 02:09:28.327420 containerd[1587]: 2025-12-16 02:09:28.205 [INFO][3928] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 02:09:28.327420 containerd[1587]: 2025-12-16 02:09:28.205 [INFO][3928] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 02:09:28.327420 containerd[1587]: 2025-12-16 02:09:28.205 [INFO][3928] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-9-be0981937a' Dec 16 02:09:28.327420 containerd[1587]: 2025-12-16 02:09:28.217 [INFO][3928] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:28.327420 containerd[1587]: 2025-12-16 02:09:28.231 [INFO][3928] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:28.327420 containerd[1587]: 2025-12-16 02:09:28.240 [INFO][3928] ipam/ipam.go 511: Trying affinity for 192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:28.327420 containerd[1587]: 2025-12-16 02:09:28.245 [INFO][3928] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:28.327420 containerd[1587]: 2025-12-16 02:09:28.248 [INFO][3928] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:28.328282 containerd[1587]: 2025-12-16 02:09:28.248 [INFO][3928] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.192/26 handle="k8s-pod-network.d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:28.328282 containerd[1587]: 2025-12-16 02:09:28.259 [INFO][3928] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801 Dec 16 02:09:28.328282 containerd[1587]: 2025-12-16 02:09:28.264 [INFO][3928] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.192/26 handle="k8s-pod-network.d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:28.328282 containerd[1587]: 2025-12-16 02:09:28.271 [INFO][3928] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.193/26] block=192.168.115.192/26 handle="k8s-pod-network.d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:28.328282 containerd[1587]: 2025-12-16 02:09:28.271 [INFO][3928] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.193/26] handle="k8s-pod-network.d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:28.328282 containerd[1587]: 2025-12-16 
02:09:28.272 [INFO][3928] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 02:09:28.328282 containerd[1587]: 2025-12-16 02:09:28.272 [INFO][3928] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.193/26] IPv6=[] ContainerID="d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" HandleID="k8s-pod-network.d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" Workload="ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0" Dec 16 02:09:28.328838 containerd[1587]: 2025-12-16 02:09:28.275 [INFO][3917] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" Namespace="calico-system" Pod="whisker-b648c4bbd-x7xsf" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0", GenerateName:"whisker-b648c4bbd-", Namespace:"calico-system", SelfLink:"", UID:"76743683-d50e-4bc9-aceb-a84e73d5c7be", ResourceVersion:"883", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"b648c4bbd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"", Pod:"whisker-b648c4bbd-x7xsf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.115.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif7a2f307ff8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:28.328838 containerd[1587]: 2025-12-16 02:09:28.276 [INFO][3917] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.193/32] ContainerID="d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" Namespace="calico-system" Pod="whisker-b648c4bbd-x7xsf" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0" Dec 16 02:09:28.328925 containerd[1587]: 2025-12-16 02:09:28.276 [INFO][3917] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif7a2f307ff8 ContainerID="d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" Namespace="calico-system" Pod="whisker-b648c4bbd-x7xsf" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0" Dec 16 02:09:28.328925 containerd[1587]: 2025-12-16 02:09:28.289 [INFO][3917] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" Namespace="calico-system" Pod="whisker-b648c4bbd-x7xsf" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0" Dec 16 02:09:28.328969 containerd[1587]: 2025-12-16 02:09:28.292 [INFO][3917] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" Namespace="calico-system" Pod="whisker-b648c4bbd-x7xsf" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0", GenerateName:"whisker-b648c4bbd-", Namespace:"calico-system", SelfLink:"", UID:"76743683-d50e-4bc9-aceb-a84e73d5c7be", ResourceVersion:"883", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"b648c4bbd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801", Pod:"whisker-b648c4bbd-x7xsf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.115.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif7a2f307ff8", MAC:"ce:22:25:c8:b1:e4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:28.329016 containerd[1587]: 2025-12-16 02:09:28.316 [INFO][3917] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" Namespace="calico-system" Pod="whisker-b648c4bbd-x7xsf" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-whisker--b648c4bbd--x7xsf-eth0" Dec 16 02:09:28.413733 containerd[1587]: time="2025-12-16T02:09:28.413168507Z" level=info msg="connecting to shim d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801" address="unix:///run/containerd/s/009aa60feb81221d6ea609e7d04983cc08e9371e5db1aef0904a913f62284619" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:09:28.505696 systemd[1]: Started cri-containerd-d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801.scope - libcontainer container d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801. 
Dec 16 02:09:28.526000 audit: BPF prog-id=175 op=LOAD Dec 16 02:09:28.530000 audit: BPF prog-id=176 op=LOAD Dec 16 02:09:28.530000 audit[4045]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4034 pid=4045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:28.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373834373333343661396165646437346564373938653535663631 Dec 16 02:09:28.530000 audit: BPF prog-id=176 op=UNLOAD Dec 16 02:09:28.530000 audit[4045]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4034 pid=4045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:28.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373834373333343661396165646437346564373938653535663631 Dec 16 02:09:28.530000 audit: BPF prog-id=177 op=LOAD Dec 16 02:09:28.530000 audit[4045]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4034 pid=4045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:28.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373834373333343661396165646437346564373938653535663631 Dec 16 02:09:28.530000 audit: BPF prog-id=178 op=LOAD Dec 16 02:09:28.530000 audit[4045]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4034 pid=4045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:28.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373834373333343661396165646437346564373938653535663631 Dec 16 02:09:28.530000 audit: BPF prog-id=178 op=UNLOAD Dec 16 02:09:28.530000 audit[4045]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4034 pid=4045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:28.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373834373333343661396165646437346564373938653535663631 Dec 16 02:09:28.530000 audit: BPF prog-id=177 op=UNLOAD Dec 16 02:09:28.530000 audit[4045]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4034 pid=4045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:28.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373834373333343661396165646437346564373938653535663631 Dec 16 02:09:28.530000 audit: BPF prog-id=179 op=LOAD Dec 16 02:09:28.530000 audit[4045]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4034 pid=4045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:28.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432373834373333343661396165646437346564373938653535663631 Dec 16 02:09:28.619977 containerd[1587]: time="2025-12-16T02:09:28.618830442Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-b648c4bbd-x7xsf,Uid:76743683-d50e-4bc9-aceb-a84e73d5c7be,Namespace:calico-system,Attempt:0,} returns sandbox id \"d278473346a9aedd74ed798e55f61452d8e69a27ee7f1277afb647e842dc1801\"" Dec 16 02:09:28.623539 containerd[1587]: time="2025-12-16T02:09:28.623496114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 02:09:28.977854 containerd[1587]: time="2025-12-16T02:09:28.977522111Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:28.981448 containerd[1587]: time="2025-12-16T02:09:28.981394605Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 02:09:28.981742 containerd[1587]: time="2025-12-16T02:09:28.981485362Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:28.981918 kubelet[2826]: E1216 02:09:28.981881 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 02:09:28.982254 kubelet[2826]: E1216 02:09:28.981933 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 02:09:28.982254 kubelet[2826]: E1216 02:09:28.982074 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-b648c4bbd-x7xsf_calico-system(76743683-d50e-4bc9-aceb-a84e73d5c7be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: 
not found" logger="UnhandledError" Dec 16 02:09:28.983564 containerd[1587]: time="2025-12-16T02:09:28.983538586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 02:09:29.319322 containerd[1587]: time="2025-12-16T02:09:29.319175082Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:29.321125 containerd[1587]: time="2025-12-16T02:09:29.320976198Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 02:09:29.321125 containerd[1587]: time="2025-12-16T02:09:29.321046236Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:29.321530 kubelet[2826]: E1216 02:09:29.321490 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 02:09:29.321652 kubelet[2826]: E1216 02:09:29.321631 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 02:09:29.321887 kubelet[2826]: E1216 02:09:29.321859 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-b648c4bbd-x7xsf_calico-system(76743683-d50e-4bc9-aceb-a84e73d5c7be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:29.322046 kubelet[2826]: E1216 02:09:29.321996 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:09:29.451671 kubelet[2826]: I1216 02:09:29.451449 2826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81662439-533e-48ac-902b-f1b932cc9432" path="/var/lib/kubelet/pods/81662439-533e-48ac-902b-f1b932cc9432/volumes" Dec 16 02:09:29.645419 kubelet[2826]: E1216 02:09:29.645284 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:09:29.679000 audit[4095]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=4095 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:29.679000 audit[4095]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=fffff8c6f430 a2=0 a3=1 items=0 ppid=2986 pid=4095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:29.679000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:29.688000 audit[4095]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=4095 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:29.688000 audit[4095]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff8c6f430 a2=0 a3=1 items=0 ppid=2986 pid=4095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:29.688000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:30.241139 systemd-networkd[1480]: calif7a2f307ff8: Gained IPv6LL Dec 16 02:09:30.879939 kubelet[2826]: I1216 02:09:30.879423 2826 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 02:09:33.049497 kubelet[2826]: I1216 02:09:33.049348 2826 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 02:09:33.090000 audit[4216]: NETFILTER_CFG table=filter:119 family=2 entries=21 op=nft_register_rule pid=4216 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:33.092406 kernel: kauditd_printk_skb: 33 callbacks suppressed Dec 16 02:09:33.092468 kernel: audit: type=1325 audit(1765850973.090:578): table=filter:119 family=2 entries=21 op=nft_register_rule pid=4216 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:33.090000 audit[4216]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd3bb97e0 a2=0 a3=1 items=0 ppid=2986 pid=4216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.096069 kernel: audit: type=1300 audit(1765850973.090:578): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd3bb97e0 a2=0 a3=1 items=0 ppid=2986 pid=4216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.090000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:33.097787 kernel: 
audit: type=1327 audit(1765850973.090:578): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:33.097853 kernel: audit: type=1325 audit(1765850973.096:579): table=nat:120 family=2 entries=19 op=nft_register_chain pid=4216 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:33.096000 audit[4216]: NETFILTER_CFG table=nat:120 family=2 entries=19 op=nft_register_chain pid=4216 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:33.096000 audit[4216]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffd3bb97e0 a2=0 a3=1 items=0 ppid=2986 pid=4216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.101392 kernel: audit: type=1300 audit(1765850973.096:579): arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffd3bb97e0 a2=0 a3=1 items=0 ppid=2986 pid=4216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.101502 kernel: audit: type=1327 audit(1765850973.096:579): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:33.096000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:33.463208 containerd[1587]: time="2025-12-16T02:09:33.462390450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cf67c95b-68pm7,Uid:4f109e8a-b6cd-4daa-a636-a987203ce9dc,Namespace:calico-apiserver,Attempt:0,}" Dec 16 02:09:33.624969 systemd-networkd[1480]: calia3e5330b1b4: Link UP Dec 16 02:09:33.626980 systemd-networkd[1480]: calia3e5330b1b4: Gained carrier Dec 16 02:09:33.661083 containerd[1587]: 2025-12-16 02:09:33.499 [INFO][4219] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 02:09:33.661083 containerd[1587]: 2025-12-16 02:09:33.518 [INFO][4219] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0 calico-apiserver-86cf67c95b- calico-apiserver 4f109e8a-b6cd-4daa-a636-a987203ce9dc 814 0 2025-12-16 02:09:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:86cf67c95b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-9-be0981937a calico-apiserver-86cf67c95b-68pm7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia3e5330b1b4 [] [] }} ContainerID="53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-68pm7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-" Dec 16 02:09:33.661083 containerd[1587]: 2025-12-16 02:09:33.518 [INFO][4219] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-68pm7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0" Dec 16 02:09:33.661083 
containerd[1587]: 2025-12-16 02:09:33.550 [INFO][4232] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" HandleID="k8s-pod-network.53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" Workload="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0" Dec 16 02:09:33.661319 containerd[1587]: 2025-12-16 02:09:33.550 [INFO][4232] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" HandleID="k8s-pod-network.53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" Workload="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b000), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-9-be0981937a", "pod":"calico-apiserver-86cf67c95b-68pm7", "timestamp":"2025-12-16 02:09:33.550591294 +0000 UTC"}, Hostname:"ci-4547-0-0-9-be0981937a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 02:09:33.661319 containerd[1587]: 2025-12-16 02:09:33.550 [INFO][4232] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 02:09:33.661319 containerd[1587]: 2025-12-16 02:09:33.551 [INFO][4232] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 02:09:33.661319 containerd[1587]: 2025-12-16 02:09:33.551 [INFO][4232] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-9-be0981937a' Dec 16 02:09:33.661319 containerd[1587]: 2025-12-16 02:09:33.562 [INFO][4232] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:33.661319 containerd[1587]: 2025-12-16 02:09:33.569 [INFO][4232] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:33.661319 containerd[1587]: 2025-12-16 02:09:33.577 [INFO][4232] ipam/ipam.go 511: Trying affinity for 192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:33.661319 containerd[1587]: 2025-12-16 02:09:33.580 [INFO][4232] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:33.661319 containerd[1587]: 2025-12-16 02:09:33.583 [INFO][4232] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:33.661505 containerd[1587]: 2025-12-16 02:09:33.584 [INFO][4232] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.192/26 handle="k8s-pod-network.53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:33.661505 containerd[1587]: 2025-12-16 02:09:33.586 [INFO][4232] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40 Dec 16 02:09:33.661505 containerd[1587]: 2025-12-16 02:09:33.596 [INFO][4232] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.192/26 handle="k8s-pod-network.53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:33.661505 containerd[1587]: 2025-12-16 02:09:33.613 [INFO][4232] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.115.194/26] block=192.168.115.192/26 handle="k8s-pod-network.53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:33.661505 containerd[1587]: 2025-12-16 02:09:33.613 [INFO][4232] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.194/26] handle="k8s-pod-network.53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:33.661505 containerd[1587]: 2025-12-16 02:09:33.613 [INFO][4232] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 02:09:33.661505 containerd[1587]: 2025-12-16 02:09:33.613 [INFO][4232] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.194/26] IPv6=[] ContainerID="53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" HandleID="k8s-pod-network.53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" Workload="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0" Dec 16 02:09:33.661672 containerd[1587]: 2025-12-16 02:09:33.617 [INFO][4219] cni-plugin/k8s.go 418: Populated endpoint ContainerID="53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-68pm7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0", GenerateName:"calico-apiserver-86cf67c95b-", Namespace:"calico-apiserver", SelfLink:"", UID:"4f109e8a-b6cd-4daa-a636-a987203ce9dc", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86cf67c95b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"", Pod:"calico-apiserver-86cf67c95b-68pm7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.115.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia3e5330b1b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:33.661726 containerd[1587]: 2025-12-16 02:09:33.617 [INFO][4219] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.194/32] ContainerID="53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-68pm7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0" Dec 16 02:09:33.661726 containerd[1587]: 2025-12-16 02:09:33.617 [INFO][4219] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia3e5330b1b4 ContainerID="53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" 
Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-68pm7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0" Dec 16 02:09:33.661726 containerd[1587]: 2025-12-16 02:09:33.624 [INFO][4219] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-68pm7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0" Dec 16 02:09:33.661783 containerd[1587]: 2025-12-16 02:09:33.625 [INFO][4219] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-68pm7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0", GenerateName:"calico-apiserver-86cf67c95b-", Namespace:"calico-apiserver", SelfLink:"", UID:"4f109e8a-b6cd-4daa-a636-a987203ce9dc", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86cf67c95b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40", Pod:"calico-apiserver-86cf67c95b-68pm7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.115.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia3e5330b1b4", MAC:"76:fb:0b:cc:4e:13", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:33.661831 containerd[1587]: 2025-12-16 02:09:33.651 [INFO][4219] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-68pm7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--68pm7-eth0" Dec 16 02:09:33.702336 containerd[1587]: time="2025-12-16T02:09:33.702265336Z" level=info msg="connecting to shim 53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40" address="unix:///run/containerd/s/b3cf6e7c2e02359d8e3c569834f292f93adcc70ef6d6118d1a2849e76db8f7dd" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:09:33.748353 systemd[1]: Started cri-containerd-53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40.scope - libcontainer container 53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40. 
Dec 16 02:09:33.766000 audit: BPF prog-id=180 op=LOAD Dec 16 02:09:33.767000 audit: BPF prog-id=181 op=LOAD Dec 16 02:09:33.768688 kernel: audit: type=1334 audit(1765850973.766:580): prog-id=180 op=LOAD Dec 16 02:09:33.768758 kernel: audit: type=1334 audit(1765850973.767:581): prog-id=181 op=LOAD Dec 16 02:09:33.768784 kernel: audit: type=1300 audit(1765850973.767:581): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4255 pid=4267 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.767000 audit[4267]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4255 pid=4267 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.771367 kernel: audit: type=1327 audit(1765850973.767:581): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533646432653338623164396330633337643530336134663764633430 Dec 16 02:09:33.767000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533646432653338623164396330633337643530336134663764633430 Dec 16 02:09:33.767000 audit: BPF prog-id=181 op=UNLOAD Dec 16 02:09:33.767000 audit[4267]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4255 pid=4267 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.767000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533646432653338623164396330633337643530336134663764633430 Dec 16 02:09:33.767000 audit: BPF prog-id=182 op=LOAD Dec 16 02:09:33.767000 audit[4267]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4255 pid=4267 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.767000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533646432653338623164396330633337643530336134663764633430 Dec 16 02:09:33.770000 audit: BPF prog-id=183 op=LOAD Dec 16 02:09:33.770000 audit[4267]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4255 pid=4267 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.770000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533646432653338623164396330633337643530336134663764633430 Dec 16 02:09:33.773000 audit: BPF prog-id=183 op=UNLOAD Dec 16 02:09:33.773000 audit[4267]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4255 pid=4267 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.773000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533646432653338623164396330633337643530336134663764633430 Dec 16 02:09:33.773000 audit: BPF prog-id=182 op=UNLOAD Dec 16 02:09:33.773000 audit[4267]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4255 pid=4267 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.773000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533646432653338623164396330633337643530336134663764633430 Dec 16 02:09:33.773000 audit: BPF prog-id=184 op=LOAD Dec 16 02:09:33.773000 audit[4267]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4255 pid=4267 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:33.773000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533646432653338623164396330633337643530336134663764633430 Dec 16 02:09:33.818689 containerd[1587]: time="2025-12-16T02:09:33.818554265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cf67c95b-68pm7,Uid:4f109e8a-b6cd-4daa-a636-a987203ce9dc,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"53dd2e38b1d9c0c37d503a4f7dc40af47af05f5a487d602426e67c048ab83a40\"" Dec 16 02:09:33.822346 containerd[1587]: time="2025-12-16T02:09:33.822110180Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 02:09:34.082000 audit: BPF prog-id=185 op=LOAD Dec 16 02:09:34.082000 audit[4337]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe10fa868 a2=98 a3=ffffe10fa858 items=0 ppid=4293 pid=4337 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.082000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 02:09:34.082000 audit: BPF prog-id=185 op=UNLOAD Dec 16 
02:09:34.082000 audit[4337]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffe10fa838 a3=0 items=0 ppid=4293 pid=4337 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.082000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 02:09:34.083000 audit: BPF prog-id=186 op=LOAD Dec 16 02:09:34.083000 audit[4337]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe10fa718 a2=74 a3=95 items=0 ppid=4293 pid=4337 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.083000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 02:09:34.083000 audit: BPF prog-id=186 op=UNLOAD Dec 16 02:09:34.083000 audit[4337]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4293 pid=4337 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.083000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 02:09:34.084000 audit: BPF prog-id=187 op=LOAD Dec 16 02:09:34.084000 audit[4337]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe10fa748 a2=40 a3=ffffe10fa778 items=0 ppid=4293 pid=4337 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.084000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 02:09:34.084000 audit: BPF prog-id=187 op=UNLOAD Dec 16 02:09:34.084000 audit[4337]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffe10fa778 items=0 ppid=4293 pid=4337 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.084000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 02:09:34.089000 audit: BPF prog-id=188 op=LOAD Dec 16 02:09:34.089000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd3261568 
a2=98 a3=ffffd3261558 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.089000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.089000 audit: BPF prog-id=188 op=UNLOAD Dec 16 02:09:34.089000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd3261538 a3=0 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.089000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.092000 audit: BPF prog-id=189 op=LOAD Dec 16 02:09:34.092000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd32611f8 a2=74 a3=95 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.092000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.092000 audit: BPF prog-id=189 op=UNLOAD Dec 16 02:09:34.092000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.092000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.092000 audit: BPF prog-id=190 op=LOAD Dec 16 02:09:34.092000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd3261258 a2=94 a3=2 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.092000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.092000 audit: BPF prog-id=190 op=UNLOAD Dec 16 02:09:34.092000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.092000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.171742 containerd[1587]: time="2025-12-16T02:09:34.171610855Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:34.174581 containerd[1587]: time="2025-12-16T02:09:34.174468626Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 02:09:34.174947 containerd[1587]: time="2025-12-16T02:09:34.174767383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:34.175419 kubelet[2826]: E1216 02:09:34.175360 2826 log.go:32] "PullImage from image service failed" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:09:34.176591 kubelet[2826]: E1216 02:09:34.176043 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:09:34.176591 kubelet[2826]: E1216 02:09:34.176181 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-86cf67c95b-68pm7_calico-apiserver(4f109e8a-b6cd-4daa-a636-a987203ce9dc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:34.177051 kubelet[2826]: E1216 02:09:34.176970 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:09:34.328000 audit: BPF prog-id=191 op=LOAD Dec 16 02:09:34.328000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd3261218 a2=40 a3=ffffd3261248 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.328000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.328000 audit: BPF prog-id=191 op=UNLOAD Dec 16 02:09:34.328000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffd3261248 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.328000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.338000 audit: BPF prog-id=192 op=LOAD Dec 16 02:09:34.338000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd3261228 a2=94 a3=4 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.338000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.338000 audit: BPF prog-id=192 op=UNLOAD Dec 16 02:09:34.338000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.338000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.339000 audit: BPF prog-id=193 op=LOAD 
Dec 16 02:09:34.339000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd3261068 a2=94 a3=5 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.339000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.339000 audit: BPF prog-id=193 op=UNLOAD Dec 16 02:09:34.339000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.339000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.339000 audit: BPF prog-id=194 op=LOAD Dec 16 02:09:34.339000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd3261298 a2=94 a3=6 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.339000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.340000 audit: BPF prog-id=194 op=UNLOAD Dec 16 02:09:34.340000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.340000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.340000 audit: BPF prog-id=195 op=LOAD Dec 16 02:09:34.340000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd3260a68 a2=94 a3=83 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.340000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.340000 audit: BPF prog-id=196 op=LOAD Dec 16 02:09:34.340000 audit[4338]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffd3260828 a2=94 a3=2 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.340000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.340000 audit: BPF prog-id=196 op=UNLOAD Dec 16 02:09:34.340000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.340000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.341000 audit: BPF prog-id=195 op=UNLOAD Dec 16 02:09:34.341000 audit[4338]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=3c6f5620 a3=3c6e8b00 items=0 ppid=4293 pid=4338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.341000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 02:09:34.353000 audit: BPF prog-id=197 op=LOAD Dec 16 02:09:34.353000 audit[4350]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc559afc8 a2=98 a3=ffffc559afb8 items=0 ppid=4293 pid=4350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.353000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 02:09:34.353000 audit: BPF prog-id=197 op=UNLOAD Dec 16 02:09:34.353000 audit[4350]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffc559af98 a3=0 items=0 ppid=4293 pid=4350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.353000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 02:09:34.353000 audit: BPF prog-id=198 op=LOAD Dec 16 02:09:34.353000 audit[4350]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc559ae78 a2=74 a3=95 items=0 ppid=4293 pid=4350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.353000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 02:09:34.353000 audit: BPF prog-id=198 op=UNLOAD Dec 16 02:09:34.353000 audit[4350]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4293 pid=4350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.353000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 02:09:34.353000 audit: BPF prog-id=199 op=LOAD Dec 16 02:09:34.353000 audit[4350]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc559aea8 a2=40 a3=ffffc559aed8 items=0 ppid=4293 pid=4350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.353000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 02:09:34.353000 audit: BPF prog-id=199 op=UNLOAD Dec 16 02:09:34.353000 audit[4350]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffc559aed8 items=0 ppid=4293 pid=4350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.353000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 02:09:34.448525 systemd-networkd[1480]: vxlan.calico: Link UP Dec 16 02:09:34.448532 systemd-networkd[1480]: vxlan.calico: Gained carrier Dec 16 02:09:34.456218 containerd[1587]: time="2025-12-16T02:09:34.455980297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75666888-t2jlw,Uid:a4f3ee57-42ce-4008-b96e-85199f6fd632,Namespace:calico-system,Attempt:0,}" Dec 16 02:09:34.462423 containerd[1587]: time="2025-12-16T02:09:34.462233115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-cl8m7,Uid:e4bfcf46-398e-437f-b7f1-81589479eeb2,Namespace:calico-system,Attempt:0,}" Dec 16 02:09:34.480000 audit: BPF prog-id=200 op=LOAD Dec 16 02:09:34.480000 audit[4378]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff706a168 a2=98 a3=fffff706a158 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.480000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.480000 audit: BPF prog-id=200 op=UNLOAD Dec 16 02:09:34.480000 audit[4378]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff706a138 a3=0 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.480000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.482000 audit: BPF prog-id=201 op=LOAD Dec 16 02:09:34.482000 audit[4378]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff7069e48 a2=74 a3=95 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.482000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 
Dec 16 02:09:34.482000 audit: BPF prog-id=201 op=UNLOAD Dec 16 02:09:34.482000 audit[4378]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.482000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.482000 audit: BPF prog-id=202 op=LOAD Dec 16 02:09:34.482000 audit[4378]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff7069ea8 a2=94 a3=2 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.482000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.482000 audit: BPF prog-id=202 op=UNLOAD Dec 16 02:09:34.482000 audit[4378]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.482000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.482000 audit: BPF prog-id=203 op=LOAD Dec 16 02:09:34.482000 audit[4378]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff7069d28 a2=40 a3=fffff7069d58 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.482000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.482000 audit: BPF prog-id=203 op=UNLOAD Dec 16 02:09:34.482000 audit[4378]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=fffff7069d58 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.482000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.482000 audit: BPF prog-id=204 op=LOAD Dec 16 02:09:34.482000 audit[4378]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff7069e78 a2=94 a3=b7 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.482000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.482000 audit: BPF prog-id=204 op=UNLOAD Dec 16 02:09:34.482000 audit[4378]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.482000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.484000 audit: BPF prog-id=205 op=LOAD Dec 16 02:09:34.484000 audit[4378]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff7069528 a2=94 a3=2 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.484000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.485000 audit: BPF prog-id=205 op=UNLOAD Dec 16 02:09:34.485000 audit[4378]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.485000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.485000 audit: BPF prog-id=206 op=LOAD Dec 16 02:09:34.485000 audit[4378]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff70696b8 a2=94 a3=30 items=0 ppid=4293 pid=4378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.485000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 02:09:34.495000 audit: BPF prog-id=207 op=LOAD Dec 16 02:09:34.495000 audit[4392]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc70c9e08 a2=98 a3=ffffc70c9df8 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.495000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.495000 audit: BPF 
prog-id=207 op=UNLOAD Dec 16 02:09:34.495000 audit[4392]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffc70c9dd8 a3=0 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.495000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.496000 audit: BPF prog-id=208 op=LOAD Dec 16 02:09:34.496000 audit[4392]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc70c9a98 a2=74 a3=95 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.496000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.496000 audit: BPF prog-id=208 op=UNLOAD Dec 16 02:09:34.496000 audit[4392]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.496000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.496000 audit: BPF prog-id=209 op=LOAD Dec 16 02:09:34.496000 audit[4392]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc70c9af8 a2=94 a3=2 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.496000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.496000 audit: BPF prog-id=209 op=UNLOAD Dec 16 02:09:34.496000 audit[4392]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.496000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.669700 kubelet[2826]: E1216 02:09:34.669585 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:09:34.700162 systemd-networkd[1480]: cali6b9392b4359: Link UP Dec 16 02:09:34.703570 systemd-networkd[1480]: cali6b9392b4359: Gained carrier Dec 16 02:09:34.706000 audit: BPF prog-id=210 op=LOAD Dec 16 02:09:34.706000 audit[4392]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc70c9ab8 a2=40 a3=ffffc70c9ae8 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.706000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.706000 audit: BPF prog-id=210 op=UNLOAD Dec 16 02:09:34.706000 audit[4392]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffc70c9ae8 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.706000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.729000 audit: BPF prog-id=211 op=LOAD Dec 16 02:09:34.729000 audit[4392]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc70c9ac8 a2=94 a3=4 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.729000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.729000 audit: BPF prog-id=211 op=UNLOAD Dec 16 02:09:34.729000 audit[4392]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.729000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.730000 audit: BPF prog-id=212 op=LOAD Dec 16 02:09:34.730000 audit[4392]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffc70c9908 a2=94 a3=5 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.730000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.732000 audit: BPF prog-id=212 op=UNLOAD Dec 16 02:09:34.732000 audit[4392]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.732000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.732000 audit: BPF prog-id=213 op=LOAD Dec 16 02:09:34.732000 audit[4392]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc70c9b38 a2=94 a3=6 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.732000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.734000 audit: BPF prog-id=213 op=UNLOAD Dec 16 02:09:34.734000 audit[4392]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.734000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.735000 audit: BPF prog-id=214 op=LOAD Dec 16 02:09:34.735000 audit[4392]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc70c9308 a2=94 a3=83 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.735000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.737000 audit: BPF prog-id=215 op=LOAD Dec 16 02:09:34.737000 audit[4392]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffc70c90c8 a2=94 a3=2 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.737000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.737000 audit: BPF prog-id=215 op=UNLOAD Dec 16 02:09:34.737000 audit[4392]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.737000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.738000 audit: BPF prog-id=214 op=UNLOAD Dec 16 02:09:34.738000 audit[4392]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=1d877620 a3=1d86ab00 items=0 ppid=4293 pid=4392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.738000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 02:09:34.742305 containerd[1587]: 2025-12-16 02:09:34.583 [INFO][4391] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0 goldmane-7c778bb748- calico-system e4bfcf46-398e-437f-b7f1-81589479eeb2 819 0 2025-12-16 02:09:09 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4547-0-0-9-be0981937a goldmane-7c778bb748-cl8m7 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali6b9392b4359 [] [] }} ContainerID="d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" Namespace="calico-system" Pod="goldmane-7c778bb748-cl8m7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-" Dec 16 02:09:34.742305 containerd[1587]: 2025-12-16 02:09:34.583 [INFO][4391] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" Namespace="calico-system" Pod="goldmane-7c778bb748-cl8m7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0" Dec 16 02:09:34.742305 containerd[1587]: 2025-12-16 02:09:34.623 [INFO][4413] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" HandleID="k8s-pod-network.d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" Workload="ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0" Dec 16 02:09:34.743523 containerd[1587]: 2025-12-16 02:09:34.623 [INFO][4413] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" HandleID="k8s-pod-network.d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" Workload="ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b660), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-9-be0981937a", "pod":"goldmane-7c778bb748-cl8m7", "timestamp":"2025-12-16 02:09:34.62306399 +0000 UTC"}, Hostname:"ci-4547-0-0-9-be0981937a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 02:09:34.743523 containerd[1587]: 2025-12-16 02:09:34.623 [INFO][4413] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 02:09:34.743523 containerd[1587]: 2025-12-16 02:09:34.623 [INFO][4413] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 02:09:34.743523 containerd[1587]: 2025-12-16 02:09:34.623 [INFO][4413] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-9-be0981937a' Dec 16 02:09:34.743523 containerd[1587]: 2025-12-16 02:09:34.637 [INFO][4413] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.743523 containerd[1587]: 2025-12-16 02:09:34.648 [INFO][4413] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.743523 containerd[1587]: 2025-12-16 02:09:34.654 [INFO][4413] ipam/ipam.go 511: Trying affinity for 192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.743523 containerd[1587]: 2025-12-16 02:09:34.657 [INFO][4413] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.743523 containerd[1587]: 2025-12-16 02:09:34.661 [INFO][4413] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.743815 containerd[1587]: 2025-12-16 02:09:34.661 [INFO][4413] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.192/26 handle="k8s-pod-network.d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.743815 containerd[1587]: 2025-12-16 02:09:34.664 [INFO][4413] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f Dec 16 02:09:34.743815 containerd[1587]: 2025-12-16 02:09:34.672 [INFO][4413] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.192/26 handle="k8s-pod-network.d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.743815 containerd[1587]: 2025-12-16 02:09:34.682 [INFO][4413] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.195/26] block=192.168.115.192/26 handle="k8s-pod-network.d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.743815 containerd[1587]: 2025-12-16 02:09:34.682 [INFO][4413] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.195/26] handle="k8s-pod-network.d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.743815 containerd[1587]: 2025-12-16 02:09:34.682 [INFO][4413] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 02:09:34.743815 containerd[1587]: 2025-12-16 02:09:34.682 [INFO][4413] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.195/26] IPv6=[] ContainerID="d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" HandleID="k8s-pod-network.d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" Workload="ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0" Dec 16 02:09:34.743987 containerd[1587]: 2025-12-16 02:09:34.688 [INFO][4391] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" Namespace="calico-system" Pod="goldmane-7c778bb748-cl8m7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"e4bfcf46-398e-437f-b7f1-81589479eeb2", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"", Pod:"goldmane-7c778bb748-cl8m7", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.115.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali6b9392b4359", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:34.745077 containerd[1587]: 2025-12-16 02:09:34.688 [INFO][4391] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.195/32] ContainerID="d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" Namespace="calico-system" Pod="goldmane-7c778bb748-cl8m7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0" Dec 16 02:09:34.745077 containerd[1587]: 2025-12-16 02:09:34.689 [INFO][4391] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6b9392b4359 ContainerID="d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" Namespace="calico-system" Pod="goldmane-7c778bb748-cl8m7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0" Dec 16 02:09:34.745077 containerd[1587]: 2025-12-16 02:09:34.714 [INFO][4391] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" Namespace="calico-system" Pod="goldmane-7c778bb748-cl8m7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0" Dec 16 02:09:34.745167 containerd[1587]: 2025-12-16 02:09:34.716 [INFO][4391] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" 
Namespace="calico-system" Pod="goldmane-7c778bb748-cl8m7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"e4bfcf46-398e-437f-b7f1-81589479eeb2", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f", Pod:"goldmane-7c778bb748-cl8m7", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.115.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali6b9392b4359", MAC:"9a:9f:84:34:15:df", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:34.745226 containerd[1587]: 2025-12-16 02:09:34.732 [INFO][4391] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" Namespace="calico-system" Pod="goldmane-7c778bb748-cl8m7" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-goldmane--7c778bb748--cl8m7-eth0" Dec 16 02:09:34.745000 audit[4429]: NETFILTER_CFG table=filter:121 family=2 entries=20 op=nft_register_rule pid=4429 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:34.745000 audit[4429]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe3b0f2a0 a2=0 a3=1 items=0 ppid=2986 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.745000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:34.746000 audit[4429]: NETFILTER_CFG table=nat:122 family=2 entries=14 op=nft_register_rule pid=4429 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:34.746000 audit[4429]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffe3b0f2a0 a2=0 a3=1 items=0 ppid=2986 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.746000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:34.755000 audit: BPF prog-id=206 op=UNLOAD Dec 16 02:09:34.755000 audit[4293]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=40005da980 a2=0 a3=0 items=0 ppid=3937 pid=4293 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.755000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Dec 16 02:09:34.788379 containerd[1587]: time="2025-12-16T02:09:34.788336341Z" level=info msg="connecting to shim d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f" address="unix:///run/containerd/s/34ad9846a493088f91f06283c08558d37a376e15730c3196ad13a59b1739a9ae" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:09:34.817695 systemd-networkd[1480]: cali0bed2431991: Link UP Dec 16 02:09:34.819753 systemd-networkd[1480]: cali0bed2431991: Gained carrier Dec 16 02:09:34.850084 containerd[1587]: 2025-12-16 02:09:34.590 [INFO][4379] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0 calico-kube-controllers-75666888- calico-system a4f3ee57-42ce-4008-b96e-85199f6fd632 815 0 2025-12-16 02:09:13 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:75666888 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-0-0-9-be0981937a calico-kube-controllers-75666888-t2jlw eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0bed2431991 [] [] }} ContainerID="10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" Namespace="calico-system" Pod="calico-kube-controllers-75666888-t2jlw" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-" Dec 16 02:09:34.850084 containerd[1587]: 2025-12-16 02:09:34.590 [INFO][4379] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" Namespace="calico-system" Pod="calico-kube-controllers-75666888-t2jlw" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0" Dec 16 02:09:34.850084 containerd[1587]: 2025-12-16 02:09:34.650 [INFO][4418] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" HandleID="k8s-pod-network.10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" Workload="ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0" Dec 16 02:09:34.850359 containerd[1587]: 2025-12-16 02:09:34.651 [INFO][4418] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" HandleID="k8s-pod-network.10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" Workload="ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024afe0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-9-be0981937a", "pod":"calico-kube-controllers-75666888-t2jlw", "timestamp":"2025-12-16 02:09:34.650850833 +0000 UTC"}, Hostname:"ci-4547-0-0-9-be0981937a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 02:09:34.850359 
containerd[1587]: 2025-12-16 02:09:34.651 [INFO][4418] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 02:09:34.850359 containerd[1587]: 2025-12-16 02:09:34.682 [INFO][4418] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 02:09:34.850359 containerd[1587]: 2025-12-16 02:09:34.683 [INFO][4418] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-9-be0981937a' Dec 16 02:09:34.850359 containerd[1587]: 2025-12-16 02:09:34.736 [INFO][4418] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.850359 containerd[1587]: 2025-12-16 02:09:34.745 [INFO][4418] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.850359 containerd[1587]: 2025-12-16 02:09:34.760 [INFO][4418] ipam/ipam.go 511: Trying affinity for 192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.850359 containerd[1587]: 2025-12-16 02:09:34.765 [INFO][4418] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.850359 containerd[1587]: 2025-12-16 02:09:34.774 [INFO][4418] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.850626 containerd[1587]: 2025-12-16 02:09:34.774 [INFO][4418] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.192/26 handle="k8s-pod-network.10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.850626 containerd[1587]: 2025-12-16 02:09:34.780 [INFO][4418] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7 Dec 16 02:09:34.850626 containerd[1587]: 2025-12-16 02:09:34.790 [INFO][4418] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.192/26 handle="k8s-pod-network.10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.850626 containerd[1587]: 2025-12-16 02:09:34.802 [INFO][4418] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.196/26] block=192.168.115.192/26 handle="k8s-pod-network.10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.850626 containerd[1587]: 2025-12-16 02:09:34.802 [INFO][4418] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.196/26] handle="k8s-pod-network.10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:34.850626 containerd[1587]: 2025-12-16 02:09:34.802 [INFO][4418] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 02:09:34.850626 containerd[1587]: 2025-12-16 02:09:34.802 [INFO][4418] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.196/26] IPv6=[] ContainerID="10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" HandleID="k8s-pod-network.10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" Workload="ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0" Dec 16 02:09:34.850766 containerd[1587]: 2025-12-16 02:09:34.808 [INFO][4379] cni-plugin/k8s.go 418: Populated endpoint ContainerID="10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" Namespace="calico-system" Pod="calico-kube-controllers-75666888-t2jlw" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0", GenerateName:"calico-kube-controllers-75666888-", Namespace:"calico-system", SelfLink:"", UID:"a4f3ee57-42ce-4008-b96e-85199f6fd632", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"75666888", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"", Pod:"calico-kube-controllers-75666888-t2jlw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.115.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0bed2431991", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:34.850821 containerd[1587]: 2025-12-16 02:09:34.808 [INFO][4379] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.196/32] ContainerID="10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" Namespace="calico-system" Pod="calico-kube-controllers-75666888-t2jlw" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0" Dec 16 02:09:34.850821 containerd[1587]: 2025-12-16 02:09:34.808 [INFO][4379] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0bed2431991 ContainerID="10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" Namespace="calico-system" Pod="calico-kube-controllers-75666888-t2jlw" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0" Dec 16 02:09:34.850821 containerd[1587]: 2025-12-16 02:09:34.828 [INFO][4379] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" Namespace="calico-system" Pod="calico-kube-controllers-75666888-t2jlw" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0" 
Dec 16 02:09:34.850885 containerd[1587]: 2025-12-16 02:09:34.829 [INFO][4379] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" Namespace="calico-system" Pod="calico-kube-controllers-75666888-t2jlw" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0", GenerateName:"calico-kube-controllers-75666888-", Namespace:"calico-system", SelfLink:"", UID:"a4f3ee57-42ce-4008-b96e-85199f6fd632", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"75666888", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7", Pod:"calico-kube-controllers-75666888-t2jlw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.115.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0bed2431991", MAC:"ea:94:be:78:88:73", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:34.850933 containerd[1587]: 2025-12-16 02:09:34.843 [INFO][4379] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" Namespace="calico-system" Pod="calico-kube-controllers-75666888-t2jlw" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--kube--controllers--75666888--t2jlw-eth0" Dec 16 02:09:34.866350 systemd[1]: Started cri-containerd-d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f.scope - libcontainer container d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f. 
Dec 16 02:09:34.904000 audit: BPF prog-id=216 op=LOAD Dec 16 02:09:34.905000 audit: BPF prog-id=217 op=LOAD Dec 16 02:09:34.905000 audit[4461]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4446 pid=4461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432353030303964326531643937633833663431363563383233393061 Dec 16 02:09:34.905000 audit: BPF prog-id=217 op=UNLOAD Dec 16 02:09:34.905000 audit[4461]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4446 pid=4461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432353030303964326531643937633833663431363563383233393061 Dec 16 02:09:34.905000 audit: BPF prog-id=218 op=LOAD Dec 16 02:09:34.905000 audit[4461]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4446 pid=4461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432353030303964326531643937633833663431363563383233393061 Dec 16 02:09:34.905000 audit: BPF prog-id=219 op=LOAD Dec 16 02:09:34.905000 audit[4461]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4446 pid=4461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432353030303964326531643937633833663431363563383233393061 Dec 16 02:09:34.905000 audit: BPF prog-id=219 op=UNLOAD Dec 16 02:09:34.905000 audit[4461]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4446 pid=4461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432353030303964326531643937633833663431363563383233393061 Dec 16 02:09:34.907000 audit: BPF prog-id=218 op=UNLOAD Dec 16 02:09:34.907000 audit[4461]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4446 pid=4461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.907000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432353030303964326531643937633833663431363563383233393061 Dec 16 02:09:34.907000 audit: BPF prog-id=220 op=LOAD Dec 16 02:09:34.907000 audit[4461]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4446 pid=4461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.907000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6432353030303964326531643937633833663431363563383233393061 Dec 16 02:09:34.910299 containerd[1587]: time="2025-12-16T02:09:34.910251245Z" level=info msg="connecting to shim 10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7" address="unix:///run/containerd/s/31a72524abc1aede92564efadc116d90a73a3d07fb6b9cd3ffd06106d33847c1" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:09:34.936000 audit[4520]: NETFILTER_CFG table=mangle:123 family=2 entries=16 op=nft_register_chain pid=4520 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 02:09:34.936000 audit[4520]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=ffffd8333d20 a2=0 a3=ffff8e884fa8 items=0 ppid=4293 pid=4520 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.936000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 02:09:34.955000 audit[4527]: NETFILTER_CFG table=nat:124 family=2 entries=15 op=nft_register_chain pid=4527 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 02:09:34.955000 audit[4527]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffd0394800 a2=0 a3=ffffa8699fa8 items=0 ppid=4293 pid=4527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.955000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 02:09:34.962000 audit[4516]: NETFILTER_CFG table=raw:125 family=2 entries=21 op=nft_register_chain pid=4516 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 02:09:34.962000 audit[4516]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffffc80b10 a2=0 a3=ffffbd5c1fa8 items=0 ppid=4293 pid=4516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.962000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 02:09:34.987740 containerd[1587]: time="2025-12-16T02:09:34.987523354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-cl8m7,Uid:e4bfcf46-398e-437f-b7f1-81589479eeb2,Namespace:calico-system,Attempt:0,} returns sandbox id \"d250009d2e1d97c83f4165c82390a4283f94bebdfca2d0531edd9c75f7e9c48f\"" Dec 16 02:09:34.993216 containerd[1587]: time="2025-12-16T02:09:34.992115588Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 02:09:34.994424 systemd[1]: Started cri-containerd-10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7.scope - libcontainer container 10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7. Dec 16 02:09:34.980000 audit[4536]: NETFILTER_CFG table=filter:126 family=2 entries=136 op=nft_register_chain pid=4536 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 02:09:34.980000 audit[4536]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=78424 a0=3 a1=fffff8d94e30 a2=0 a3=ffffab4a4fa8 items=0 ppid=4293 pid=4536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:34.980000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 02:09:35.030000 audit: BPF prog-id=221 op=LOAD Dec 16 02:09:35.031000 audit: BPF prog-id=222 op=LOAD Dec 16 02:09:35.031000 audit[4519]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=4504 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.031000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130626462393964383039653735393333623762366364393866323634 Dec 16 02:09:35.031000 audit: BPF prog-id=222 op=UNLOAD Dec 16 02:09:35.031000 audit[4519]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4504 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.031000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130626462393964383039653735393333623762366364393866323634 Dec 16 02:09:35.031000 audit: BPF prog-id=223 op=LOAD Dec 16 02:09:35.031000 audit[4519]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=4504 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.031000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130626462393964383039653735393333623762366364393866323634 Dec 16 02:09:35.032000 audit: BPF prog-id=224 op=LOAD Dec 16 02:09:35.032000 audit[4519]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=4504 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.032000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130626462393964383039653735393333623762366364393866323634 Dec 16 02:09:35.032000 audit: BPF prog-id=224 op=UNLOAD Dec 16 02:09:35.032000 audit[4519]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4504 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.032000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130626462393964383039653735393333623762366364393866323634 Dec 16 02:09:35.032000 audit: BPF prog-id=223 op=UNLOAD Dec 16 02:09:35.032000 audit[4519]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4504 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.032000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130626462393964383039653735393333623762366364393866323634 Dec 16 02:09:35.033000 audit: BPF prog-id=225 op=LOAD Dec 16 02:09:35.033000 audit[4519]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=4504 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.033000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130626462393964383039653735393333623762366364393866323634 Dec 16 02:09:35.078317 containerd[1587]: time="2025-12-16T02:09:35.078272650Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75666888-t2jlw,Uid:a4f3ee57-42ce-4008-b96e-85199f6fd632,Namespace:calico-system,Attempt:0,} returns sandbox id \"10bdb99d809e75933b7b6cd98f2644e2a4e77940edd2fe620b0951b0dd4b80b7\"" Dec 16 02:09:35.086000 audit[4564]: NETFILTER_CFG table=filter:127 family=2 entries=76 op=nft_register_chain pid=4564 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 02:09:35.086000 audit[4564]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=43044 a0=3 a1=ffffc5dfd4f0 a2=0 a3=ffff8e92dfa8 items=0 ppid=4293 pid=4564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.086000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 02:09:35.231242 systemd-networkd[1480]: calia3e5330b1b4: Gained IPv6LL Dec 16 02:09:35.359941 containerd[1587]: time="2025-12-16T02:09:35.359857569Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:35.361382 containerd[1587]: time="2025-12-16T02:09:35.361332638Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 02:09:35.361788 containerd[1587]: time="2025-12-16T02:09:35.361418278Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:35.361861 kubelet[2826]: E1216 02:09:35.361625 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 02:09:35.361861 kubelet[2826]: E1216 02:09:35.361711 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 02:09:35.362934 kubelet[2826]: E1216 02:09:35.362474 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-cl8m7_calico-system(e4bfcf46-398e-437f-b7f1-81589479eeb2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:35.362934 kubelet[2826]: E1216 02:09:35.362511 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:09:35.364373 containerd[1587]: time="2025-12-16T02:09:35.364277937Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 02:09:35.456403 containerd[1587]: time="2025-12-16T02:09:35.456350816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cf67c95b-xtmz8,Uid:24ee6e1b-64f9-47f5-86c3-17009d2e74c9,Namespace:calico-apiserver,Attempt:0,}" Dec 16 02:09:35.603935 systemd-networkd[1480]: calib9e7e02d694: Link UP Dec 16 02:09:35.605654 systemd-networkd[1480]: calib9e7e02d694: Gained carrier Dec 16 02:09:35.631663 containerd[1587]: 2025-12-16 02:09:35.514 
[INFO][4569] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0 calico-apiserver-86cf67c95b- calico-apiserver 24ee6e1b-64f9-47f5-86c3-17009d2e74c9 816 0 2025-12-16 02:09:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:86cf67c95b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-9-be0981937a calico-apiserver-86cf67c95b-xtmz8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib9e7e02d694 [] [] }} ContainerID="d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-xtmz8" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-" Dec 16 02:09:35.631663 containerd[1587]: 2025-12-16 02:09:35.514 [INFO][4569] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-xtmz8" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0" Dec 16 02:09:35.631663 containerd[1587]: 2025-12-16 02:09:35.543 [INFO][4579] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" HandleID="k8s-pod-network.d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" Workload="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0" Dec 16 02:09:35.631926 containerd[1587]: 2025-12-16 02:09:35.544 [INFO][4579] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" HandleID="k8s-pod-network.d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" Workload="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b170), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-9-be0981937a", "pod":"calico-apiserver-86cf67c95b-xtmz8", "timestamp":"2025-12-16 02:09:35.54382041 +0000 UTC"}, Hostname:"ci-4547-0-0-9-be0981937a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 02:09:35.631926 containerd[1587]: 2025-12-16 02:09:35.544 [INFO][4579] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 02:09:35.631926 containerd[1587]: 2025-12-16 02:09:35.544 [INFO][4579] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 02:09:35.631926 containerd[1587]: 2025-12-16 02:09:35.544 [INFO][4579] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-9-be0981937a' Dec 16 02:09:35.631926 containerd[1587]: 2025-12-16 02:09:35.559 [INFO][4579] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:35.631926 containerd[1587]: 2025-12-16 02:09:35.566 [INFO][4579] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:35.631926 containerd[1587]: 2025-12-16 02:09:35.572 [INFO][4579] ipam/ipam.go 511: Trying affinity for 192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:35.631926 containerd[1587]: 2025-12-16 02:09:35.575 [INFO][4579] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:35.631926 containerd[1587]: 2025-12-16 02:09:35.578 [INFO][4579] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:35.632381 containerd[1587]: 2025-12-16 02:09:35.578 [INFO][4579] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.192/26 handle="k8s-pod-network.d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:35.632381 containerd[1587]: 2025-12-16 02:09:35.581 [INFO][4579] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424 Dec 16 02:09:35.632381 containerd[1587]: 2025-12-16 02:09:35.586 [INFO][4579] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.192/26 handle="k8s-pod-network.d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:35.632381 containerd[1587]: 2025-12-16 02:09:35.594 [INFO][4579] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.197/26] block=192.168.115.192/26 handle="k8s-pod-network.d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:35.632381 containerd[1587]: 2025-12-16 02:09:35.595 [INFO][4579] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.197/26] handle="k8s-pod-network.d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:35.632381 containerd[1587]: 2025-12-16 02:09:35.595 [INFO][4579] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 02:09:35.632381 containerd[1587]: 2025-12-16 02:09:35.595 [INFO][4579] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.197/26] IPv6=[] ContainerID="d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" HandleID="k8s-pod-network.d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" Workload="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0" Dec 16 02:09:35.632631 containerd[1587]: 2025-12-16 02:09:35.598 [INFO][4569] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-xtmz8" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0", GenerateName:"calico-apiserver-86cf67c95b-", Namespace:"calico-apiserver", SelfLink:"", UID:"24ee6e1b-64f9-47f5-86c3-17009d2e74c9", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86cf67c95b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"", Pod:"calico-apiserver-86cf67c95b-xtmz8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.115.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib9e7e02d694", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:35.633170 containerd[1587]: 2025-12-16 02:09:35.598 [INFO][4569] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.197/32] ContainerID="d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-xtmz8" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0" Dec 16 02:09:35.633170 containerd[1587]: 2025-12-16 02:09:35.598 [INFO][4569] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib9e7e02d694 ContainerID="d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-xtmz8" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0" Dec 16 02:09:35.633170 containerd[1587]: 2025-12-16 02:09:35.610 [INFO][4569] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-xtmz8" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0" Dec 16 02:09:35.633274 containerd[1587]: 2025-12-16 
02:09:35.611 [INFO][4569] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-xtmz8" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0", GenerateName:"calico-apiserver-86cf67c95b-", Namespace:"calico-apiserver", SelfLink:"", UID:"24ee6e1b-64f9-47f5-86c3-17009d2e74c9", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86cf67c95b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424", Pod:"calico-apiserver-86cf67c95b-xtmz8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.115.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib9e7e02d694", MAC:"06:e2:91:9a:b6:60", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:35.633737 containerd[1587]: 2025-12-16 02:09:35.626 [INFO][4569] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" Namespace="calico-apiserver" Pod="calico-apiserver-86cf67c95b-xtmz8" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-calico--apiserver--86cf67c95b--xtmz8-eth0" Dec 16 02:09:35.669000 audit[4594]: NETFILTER_CFG table=filter:128 family=2 entries=49 op=nft_register_chain pid=4594 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 02:09:35.669000 audit[4594]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=25452 a0=3 a1=ffffe9f72fc0 a2=0 a3=ffffac78dfa8 items=0 ppid=4293 pid=4594 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.669000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 02:09:35.682057 containerd[1587]: time="2025-12-16T02:09:35.681719111Z" level=info msg="connecting to shim d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424" address="unix:///run/containerd/s/cc7ac00f3315255934f7c189e0f9dc0647b7df7672c0c0a29634ca6a781bb152" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:09:35.708329 kubelet[2826]: E1216 02:09:35.708272 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:09:35.710575 kubelet[2826]: E1216 02:09:35.710516 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:09:35.713203 containerd[1587]: time="2025-12-16T02:09:35.712950240Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:35.715440 containerd[1587]: time="2025-12-16T02:09:35.715297942Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 02:09:35.715440 containerd[1587]: time="2025-12-16T02:09:35.715376342Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:35.717661 kubelet[2826]: E1216 02:09:35.717572 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 02:09:35.718006 kubelet[2826]: E1216 02:09:35.717844 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 02:09:35.718166 kubelet[2826]: E1216 02:09:35.718129 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-75666888-t2jlw_calico-system(a4f3ee57-42ce-4008-b96e-85199f6fd632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:35.718373 kubelet[2826]: E1216 02:09:35.718310 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:09:35.753434 systemd[1]: Started 
cri-containerd-d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424.scope - libcontainer container d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424. Dec 16 02:09:35.778000 audit[4636]: NETFILTER_CFG table=filter:129 family=2 entries=20 op=nft_register_rule pid=4636 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:35.778000 audit[4636]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffdbf52330 a2=0 a3=1 items=0 ppid=2986 pid=4636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.778000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:35.784000 audit[4636]: NETFILTER_CFG table=nat:130 family=2 entries=14 op=nft_register_rule pid=4636 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:35.784000 audit[4636]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffdbf52330 a2=0 a3=1 items=0 ppid=2986 pid=4636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.786000 audit: BPF prog-id=226 op=LOAD Dec 16 02:09:35.784000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:35.786000 audit: BPF prog-id=227 op=LOAD Dec 16 02:09:35.786000 audit[4615]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4604 pid=4615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.786000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439306366373562323262633334303766343530383734643531643136 Dec 16 02:09:35.786000 audit: BPF prog-id=227 op=UNLOAD Dec 16 02:09:35.786000 audit[4615]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4604 pid=4615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.786000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439306366373562323262633334303766343530383734643531643136 Dec 16 02:09:35.787000 audit: BPF prog-id=228 op=LOAD Dec 16 02:09:35.787000 audit[4615]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4604 pid=4615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.787000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439306366373562323262633334303766343530383734643531643136 Dec 16 02:09:35.787000 audit: BPF prog-id=229 op=LOAD Dec 16 02:09:35.787000 audit[4615]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4604 pid=4615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.787000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439306366373562323262633334303766343530383734643531643136 Dec 16 02:09:35.787000 audit: BPF prog-id=229 op=UNLOAD Dec 16 02:09:35.787000 audit[4615]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4604 pid=4615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.787000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439306366373562323262633334303766343530383734643531643136 Dec 16 02:09:35.787000 audit: BPF prog-id=228 op=UNLOAD Dec 16 02:09:35.787000 audit[4615]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4604 pid=4615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.787000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439306366373562323262633334303766343530383734643531643136 Dec 16 02:09:35.787000 audit: BPF prog-id=230 op=LOAD Dec 16 02:09:35.787000 audit[4615]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4604 pid=4615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:35.787000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439306366373562323262633334303766343530383734643531643136 Dec 16 02:09:35.819000 containerd[1587]: time="2025-12-16T02:09:35.818953056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86cf67c95b-xtmz8,Uid:24ee6e1b-64f9-47f5-86c3-17009d2e74c9,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d90cf75b22bc3407f450874d51d16d2e2b1d07ddefa1f6938e9c4e0a5ca90424\"" Dec 16 02:09:35.823052 containerd[1587]: time="2025-12-16T02:09:35.822986146Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 02:09:35.872138 systemd-networkd[1480]: vxlan.calico: Gained IPv6LL 
Dec 16 02:09:36.174640 containerd[1587]: time="2025-12-16T02:09:36.174305666Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:36.176668 containerd[1587]: time="2025-12-16T02:09:36.176459856Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 02:09:36.176668 containerd[1587]: time="2025-12-16T02:09:36.176613375Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:36.176971 kubelet[2826]: E1216 02:09:36.176919 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:09:36.177112 kubelet[2826]: E1216 02:09:36.176994 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:09:36.177672 kubelet[2826]: E1216 02:09:36.177147 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-86cf67c95b-xtmz8_calico-apiserver(24ee6e1b-64f9-47f5-86c3-17009d2e74c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:36.177672 kubelet[2826]: E1216 02:09:36.177195 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:09:36.451539 containerd[1587]: time="2025-12-16T02:09:36.451469433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9kfwg,Uid:ee5c1b74-6a31-486f-9498-a4be28b35a8a,Namespace:kube-system,Attempt:0,}" Dec 16 02:09:36.453210 containerd[1587]: time="2025-12-16T02:09:36.453142864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-572qq,Uid:3d6c2c9e-73af-4dcd-8b45-11a259f9bc11,Namespace:kube-system,Attempt:0,}" Dec 16 02:09:36.511191 systemd-networkd[1480]: cali0bed2431991: Gained IPv6LL Dec 16 02:09:36.639916 systemd-networkd[1480]: cali6b9392b4359: Gained IPv6LL Dec 16 02:09:36.694764 systemd-networkd[1480]: cali15a42eafa1c: Link UP Dec 16 02:09:36.696994 systemd-networkd[1480]: cali15a42eafa1c: Gained carrier Dec 16 02:09:36.721877 kubelet[2826]: E1216 02:09:36.721667 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:09:36.723662 kubelet[2826]: E1216 02:09:36.722669 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:09:36.725635 kubelet[2826]: E1216 02:09:36.725533 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:09:36.729014 containerd[1587]: 2025-12-16 02:09:36.540 [INFO][4644] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0 coredns-66bc5c9577- kube-system ee5c1b74-6a31-486f-9498-a4be28b35a8a 817 0 2025-12-16 02:08:51 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-9-be0981937a coredns-66bc5c9577-9kfwg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali15a42eafa1c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" Namespace="kube-system" Pod="coredns-66bc5c9577-9kfwg" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-" Dec 16 02:09:36.729014 containerd[1587]: 2025-12-16 02:09:36.540 [INFO][4644] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" Namespace="kube-system" Pod="coredns-66bc5c9577-9kfwg" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0" Dec 16 02:09:36.729014 containerd[1587]: 2025-12-16 02:09:36.600 [INFO][4670] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" HandleID="k8s-pod-network.f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" Workload="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0" Dec 16 02:09:36.729218 containerd[1587]: 2025-12-16 02:09:36.602 [INFO][4670] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" HandleID="k8s-pod-network.f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" Workload="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d36d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-9-be0981937a", "pod":"coredns-66bc5c9577-9kfwg", "timestamp":"2025-12-16 02:09:36.600872663 +0000 UTC"}, Hostname:"ci-4547-0-0-9-be0981937a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 02:09:36.729218 containerd[1587]: 2025-12-16 02:09:36.602 [INFO][4670] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 02:09:36.729218 containerd[1587]: 2025-12-16 02:09:36.602 [INFO][4670] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 02:09:36.729218 containerd[1587]: 2025-12-16 02:09:36.602 [INFO][4670] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-9-be0981937a' Dec 16 02:09:36.729218 containerd[1587]: 2025-12-16 02:09:36.617 [INFO][4670] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.729218 containerd[1587]: 2025-12-16 02:09:36.624 [INFO][4670] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.729218 containerd[1587]: 2025-12-16 02:09:36.634 [INFO][4670] ipam/ipam.go 511: Trying affinity for 192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.729218 containerd[1587]: 2025-12-16 02:09:36.639 [INFO][4670] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.729218 containerd[1587]: 2025-12-16 02:09:36.644 [INFO][4670] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.730192 containerd[1587]: 2025-12-16 02:09:36.644 [INFO][4670] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.192/26 handle="k8s-pod-network.f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.730192 containerd[1587]: 2025-12-16 02:09:36.650 [INFO][4670] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081 Dec 16 02:09:36.730192 containerd[1587]: 2025-12-16 02:09:36.658 [INFO][4670] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.192/26 handle="k8s-pod-network.f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.730192 containerd[1587]: 2025-12-16 02:09:36.670 [INFO][4670] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.198/26] block=192.168.115.192/26 handle="k8s-pod-network.f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.730192 containerd[1587]: 2025-12-16 02:09:36.671 [INFO][4670] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.198/26] handle="k8s-pod-network.f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.730192 containerd[1587]: 2025-12-16 02:09:36.671 [INFO][4670] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 02:09:36.730192 containerd[1587]: 2025-12-16 02:09:36.671 [INFO][4670] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.198/26] IPv6=[] ContainerID="f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" HandleID="k8s-pod-network.f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" Workload="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0" Dec 16 02:09:36.730364 containerd[1587]: 2025-12-16 02:09:36.676 [INFO][4644] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" Namespace="kube-system" Pod="coredns-66bc5c9577-9kfwg" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"ee5c1b74-6a31-486f-9498-a4be28b35a8a", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 8, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"", Pod:"coredns-66bc5c9577-9kfwg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.115.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali15a42eafa1c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:36.730364 containerd[1587]: 2025-12-16 02:09:36.677 [INFO][4644] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.198/32] ContainerID="f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" Namespace="kube-system" Pod="coredns-66bc5c9577-9kfwg" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0" Dec 16 02:09:36.730364 containerd[1587]: 2025-12-16 02:09:36.677 [INFO][4644] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali15a42eafa1c ContainerID="f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" Namespace="kube-system" Pod="coredns-66bc5c9577-9kfwg" 
WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0" Dec 16 02:09:36.730364 containerd[1587]: 2025-12-16 02:09:36.698 [INFO][4644] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" Namespace="kube-system" Pod="coredns-66bc5c9577-9kfwg" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0" Dec 16 02:09:36.730364 containerd[1587]: 2025-12-16 02:09:36.698 [INFO][4644] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" Namespace="kube-system" Pod="coredns-66bc5c9577-9kfwg" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"ee5c1b74-6a31-486f-9498-a4be28b35a8a", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 8, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081", Pod:"coredns-66bc5c9577-9kfwg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.115.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali15a42eafa1c", MAC:"ea:09:f4:75:d0:ad", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:36.730594 containerd[1587]: 2025-12-16 02:09:36.718 [INFO][4644] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" Namespace="kube-system" Pod="coredns-66bc5c9577-9kfwg" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--9kfwg-eth0" Dec 16 02:09:36.795314 containerd[1587]: time="2025-12-16T02:09:36.795264193Z" level=info msg="connecting to shim f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081" 
address="unix:///run/containerd/s/5486ad2c0049091b68f674ff95306a3af50c8f6d81f01c356f0227ab1d33b111" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:09:36.841692 systemd[1]: Started cri-containerd-f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081.scope - libcontainer container f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081. Dec 16 02:09:36.852469 systemd-networkd[1480]: caliea3ef94003c: Link UP Dec 16 02:09:36.853013 systemd-networkd[1480]: caliea3ef94003c: Gained carrier Dec 16 02:09:36.856000 audit[4725]: NETFILTER_CFG table=filter:131 family=2 entries=20 op=nft_register_rule pid=4725 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:36.856000 audit[4725]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffffba16850 a2=0 a3=1 items=0 ppid=2986 pid=4725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:36.856000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:36.859000 audit[4724]: NETFILTER_CFG table=filter:132 family=2 entries=58 op=nft_register_chain pid=4724 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 02:09:36.859000 audit[4724]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=27304 a0=3 a1=ffffe43032a0 a2=0 a3=ffffb2efdfa8 items=0 ppid=4293 pid=4724 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:36.859000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 02:09:36.869000 audit[4725]: NETFILTER_CFG table=nat:133 family=2 entries=14 op=nft_register_rule pid=4725 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:36.869000 audit[4725]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=fffffba16850 a2=0 a3=1 items=0 ppid=2986 pid=4725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:36.869000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:36.889000 audit: BPF prog-id=231 op=LOAD Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.570 [INFO][4655] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0 coredns-66bc5c9577- kube-system 3d6c2c9e-73af-4dcd-8b45-11a259f9bc11 813 0 2025-12-16 02:08:51 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-9-be0981937a coredns-66bc5c9577-572qq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliea3ef94003c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" Namespace="kube-system" Pod="coredns-66bc5c9577-572qq" 
WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.570 [INFO][4655] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" Namespace="kube-system" Pod="coredns-66bc5c9577-572qq" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.636 [INFO][4675] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" HandleID="k8s-pod-network.88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" Workload="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.636 [INFO][4675] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" HandleID="k8s-pod-network.88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" Workload="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b010), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-9-be0981937a", "pod":"coredns-66bc5c9577-572qq", "timestamp":"2025-12-16 02:09:36.63636001 +0000 UTC"}, Hostname:"ci-4547-0-0-9-be0981937a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.636 [INFO][4675] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.671 [INFO][4675] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.671 [INFO][4675] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-9-be0981937a' Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.725 [INFO][4675] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.746 [INFO][4675] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.760 [INFO][4675] ipam/ipam.go 511: Trying affinity for 192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.777 [INFO][4675] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.784 [INFO][4675] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.784 [INFO][4675] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.192/26 handle="k8s-pod-network.88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.797 [INFO][4675] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8 Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.814 [INFO][4675] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.192/26 handle="k8s-pod-network.88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.828 [INFO][4675] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.199/26] block=192.168.115.192/26 handle="k8s-pod-network.88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.828 [INFO][4675] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.199/26] handle="k8s-pod-network.88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.828 [INFO][4675] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 02:09:36.889797 containerd[1587]: 2025-12-16 02:09:36.828 [INFO][4675] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.199/26] IPv6=[] ContainerID="88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" HandleID="k8s-pod-network.88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" Workload="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0" Dec 16 02:09:36.891793 containerd[1587]: 2025-12-16 02:09:36.845 [INFO][4655] cni-plugin/k8s.go 418: Populated endpoint ContainerID="88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" Namespace="kube-system" Pod="coredns-66bc5c9577-572qq" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"3d6c2c9e-73af-4dcd-8b45-11a259f9bc11", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 8, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"", Pod:"coredns-66bc5c9577-572qq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.115.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliea3ef94003c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:36.891793 containerd[1587]: 2025-12-16 02:09:36.845 [INFO][4655] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.199/32] ContainerID="88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" Namespace="kube-system" Pod="coredns-66bc5c9577-572qq" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0" Dec 16 02:09:36.891793 containerd[1587]: 2025-12-16 02:09:36.845 [INFO][4655] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliea3ef94003c ContainerID="88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" Namespace="kube-system" Pod="coredns-66bc5c9577-572qq" 
WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0" Dec 16 02:09:36.891793 containerd[1587]: 2025-12-16 02:09:36.851 [INFO][4655] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" Namespace="kube-system" Pod="coredns-66bc5c9577-572qq" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0" Dec 16 02:09:36.891793 containerd[1587]: 2025-12-16 02:09:36.855 [INFO][4655] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" Namespace="kube-system" Pod="coredns-66bc5c9577-572qq" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"3d6c2c9e-73af-4dcd-8b45-11a259f9bc11", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 8, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8", Pod:"coredns-66bc5c9577-572qq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.115.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliea3ef94003c", MAC:"5a:97:0a:b0:c1:03", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:36.891968 containerd[1587]: 2025-12-16 02:09:36.876 [INFO][4655] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" Namespace="kube-system" Pod="coredns-66bc5c9577-572qq" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-coredns--66bc5c9577--572qq-eth0" Dec 16 02:09:36.892000 audit: BPF prog-id=232 op=LOAD Dec 16 02:09:36.892000 audit[4711]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 
ppid=4700 pid=4711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:36.892000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632333034323339393966613564303932363932306630353361303339 Dec 16 02:09:36.892000 audit: BPF prog-id=232 op=UNLOAD Dec 16 02:09:36.892000 audit[4711]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4700 pid=4711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:36.892000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632333034323339393966613564303932363932306630353361303339 Dec 16 02:09:36.894000 audit: BPF prog-id=233 op=LOAD Dec 16 02:09:36.894000 audit[4711]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4700 pid=4711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:36.894000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632333034323339393966613564303932363932306630353361303339 Dec 16 02:09:36.894000 audit: BPF prog-id=234 op=LOAD Dec 16 02:09:36.894000 audit[4711]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4700 pid=4711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:36.894000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632333034323339393966613564303932363932306630353361303339 Dec 16 02:09:36.894000 audit: BPF prog-id=234 op=UNLOAD Dec 16 02:09:36.894000 audit[4711]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4700 pid=4711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:36.894000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632333034323339393966613564303932363932306630353361303339 Dec 16 02:09:36.895000 audit: BPF prog-id=233 op=UNLOAD Dec 16 02:09:36.895000 audit[4711]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4700 pid=4711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:36.895000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632333034323339393966613564303932363932306630353361303339 Dec 16 02:09:36.895000 audit: BPF prog-id=235 op=LOAD Dec 16 02:09:36.895000 audit[4711]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4700 pid=4711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:36.895000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632333034323339393966613564303932363932306630353361303339 Dec 16 02:09:36.928000 audit[4742]: NETFILTER_CFG table=filter:134 family=2 entries=52 op=nft_register_chain pid=4742 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 02:09:36.928000 audit[4742]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=23908 a0=3 a1=ffffc2d51bf0 a2=0 a3=ffffa9ee7fa8 items=0 ppid=4293 pid=4742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:36.928000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 02:09:36.936297 containerd[1587]: time="2025-12-16T02:09:36.936237185Z" level=info msg="connecting to shim 88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8" address="unix:///run/containerd/s/6f0d155c1a830d5c6632f64a769b6a8f7164df1372f2337ce8aef5dcd492c27b" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:09:36.954677 containerd[1587]: time="2025-12-16T02:09:36.954632455Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9kfwg,Uid:ee5c1b74-6a31-486f-9498-a4be28b35a8a,Namespace:kube-system,Attempt:0,} returns sandbox id \"f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081\"" Dec 16 02:09:36.964932 containerd[1587]: time="2025-12-16T02:09:36.964512727Z" level=info msg="CreateContainer within sandbox \"f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 02:09:36.980744 containerd[1587]: time="2025-12-16T02:09:36.980631848Z" level=info msg="Container 9d2cea5338ad5d73359262ad4fa4f49448d67cb6d86818844978774fe569a76a: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:09:36.990290 systemd[1]: Started cri-containerd-88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8.scope - libcontainer container 88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8. 
Dec 16 02:09:36.992981 containerd[1587]: time="2025-12-16T02:09:36.992843548Z" level=info msg="CreateContainer within sandbox \"f230423999fa5d0926920f053a039c1992ca4a9b1769cf5eca9d5aaac120f081\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9d2cea5338ad5d73359262ad4fa4f49448d67cb6d86818844978774fe569a76a\"" Dec 16 02:09:36.994315 containerd[1587]: time="2025-12-16T02:09:36.994250781Z" level=info msg="StartContainer for \"9d2cea5338ad5d73359262ad4fa4f49448d67cb6d86818844978774fe569a76a\"" Dec 16 02:09:36.997142 containerd[1587]: time="2025-12-16T02:09:36.996806569Z" level=info msg="connecting to shim 9d2cea5338ad5d73359262ad4fa4f49448d67cb6d86818844978774fe569a76a" address="unix:///run/containerd/s/5486ad2c0049091b68f674ff95306a3af50c8f6d81f01c356f0227ab1d33b111" protocol=ttrpc version=3 Dec 16 02:09:37.013000 audit: BPF prog-id=236 op=LOAD Dec 16 02:09:37.014000 audit: BPF prog-id=237 op=LOAD Dec 16 02:09:37.014000 audit[4770]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4752 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.014000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838613036303663303363353861363366303566383538616466613238 Dec 16 02:09:37.015000 audit: BPF prog-id=237 op=UNLOAD Dec 16 02:09:37.015000 audit[4770]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4752 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.015000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838613036303663303363353861363366303566383538616466613238 Dec 16 02:09:37.016000 audit: BPF prog-id=238 op=LOAD Dec 16 02:09:37.016000 audit[4770]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4752 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.016000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838613036303663303363353861363366303566383538616466613238 Dec 16 02:09:37.017000 audit: BPF prog-id=239 op=LOAD Dec 16 02:09:37.017000 audit[4770]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4752 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.017000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838613036303663303363353861363366303566383538616466613238 Dec 16 02:09:37.017000 audit: BPF prog-id=239 op=UNLOAD Dec 16 02:09:37.017000 audit[4770]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4752 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.017000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838613036303663303363353861363366303566383538616466613238 Dec 16 02:09:37.017000 audit: BPF prog-id=238 op=UNLOAD Dec 16 02:09:37.017000 audit[4770]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4752 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.017000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838613036303663303363353861363366303566383538616466613238 Dec 16 02:09:37.017000 audit: BPF prog-id=240 op=LOAD Dec 16 02:09:37.017000 audit[4770]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4752 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.017000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838613036303663303363353861363366303566383538616466613238 Dec 16 02:09:37.028707 systemd[1]: Started cri-containerd-9d2cea5338ad5d73359262ad4fa4f49448d67cb6d86818844978774fe569a76a.scope - libcontainer container 9d2cea5338ad5d73359262ad4fa4f49448d67cb6d86818844978774fe569a76a. 
Dec 16 02:09:37.062000 audit: BPF prog-id=241 op=LOAD Dec 16 02:09:37.063000 audit: BPF prog-id=242 op=LOAD Dec 16 02:09:37.063000 audit[4784]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128180 a2=98 a3=0 items=0 ppid=4700 pid=4784 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.063000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964326365613533333861643564373333353932363261643466613466 Dec 16 02:09:37.063000 audit: BPF prog-id=242 op=UNLOAD Dec 16 02:09:37.063000 audit[4784]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4700 pid=4784 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.063000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964326365613533333861643564373333353932363261643466613466 Dec 16 02:09:37.064000 audit: BPF prog-id=243 op=LOAD Dec 16 02:09:37.064000 audit[4784]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=4700 pid=4784 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.064000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964326365613533333861643564373333353932363261643466613466 Dec 16 02:09:37.064000 audit: BPF prog-id=244 op=LOAD Dec 16 02:09:37.064000 audit[4784]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=4700 pid=4784 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.064000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964326365613533333861643564373333353932363261643466613466 Dec 16 02:09:37.064000 audit: BPF prog-id=244 op=UNLOAD Dec 16 02:09:37.064000 audit[4784]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4700 pid=4784 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.064000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964326365613533333861643564373333353932363261643466613466 Dec 16 02:09:37.064000 audit: BPF prog-id=243 op=UNLOAD Dec 16 02:09:37.064000 audit[4784]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4700 pid=4784 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.064000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964326365613533333861643564373333353932363261643466613466 Dec 16 02:09:37.064000 audit: BPF prog-id=245 op=LOAD Dec 16 02:09:37.064000 audit[4784]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=4700 pid=4784 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.064000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964326365613533333861643564373333353932363261643466613466 Dec 16 02:09:37.067880 containerd[1587]: time="2025-12-16T02:09:37.067838507Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-572qq,Uid:3d6c2c9e-73af-4dcd-8b45-11a259f9bc11,Namespace:kube-system,Attempt:0,} returns sandbox id \"88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8\"" Dec 16 02:09:37.076855 containerd[1587]: time="2025-12-16T02:09:37.076396886Z" level=info msg="CreateContainer within sandbox \"88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 02:09:37.088527 containerd[1587]: time="2025-12-16T02:09:37.088234017Z" level=info msg="Container 57f5ee8ab6e1ed7eb0d01b34cdd809d4f5cf1273d673da30caf66e9a2bdd1ab1: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:09:37.095843 containerd[1587]: time="2025-12-16T02:09:37.095553919Z" level=info msg="StartContainer for \"9d2cea5338ad5d73359262ad4fa4f49448d67cb6d86818844978774fe569a76a\" returns successfully" Dec 16 02:09:37.097160 containerd[1587]: time="2025-12-16T02:09:37.097113995Z" level=info msg="CreateContainer within sandbox \"88a0606c03c58a63f05f858adfa28d129e58a6b90f844a435a3b8761d5aaeff8\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"57f5ee8ab6e1ed7eb0d01b34cdd809d4f5cf1273d673da30caf66e9a2bdd1ab1\"" Dec 16 02:09:37.098191 containerd[1587]: time="2025-12-16T02:09:37.098104032Z" level=info msg="StartContainer for \"57f5ee8ab6e1ed7eb0d01b34cdd809d4f5cf1273d673da30caf66e9a2bdd1ab1\"" Dec 16 02:09:37.099767 containerd[1587]: time="2025-12-16T02:09:37.099721428Z" level=info msg="connecting to shim 57f5ee8ab6e1ed7eb0d01b34cdd809d4f5cf1273d673da30caf66e9a2bdd1ab1" address="unix:///run/containerd/s/6f0d155c1a830d5c6632f64a769b6a8f7164df1372f2337ce8aef5dcd492c27b" protocol=ttrpc version=3 Dec 16 02:09:37.124303 systemd[1]: Started cri-containerd-57f5ee8ab6e1ed7eb0d01b34cdd809d4f5cf1273d673da30caf66e9a2bdd1ab1.scope - libcontainer container 57f5ee8ab6e1ed7eb0d01b34cdd809d4f5cf1273d673da30caf66e9a2bdd1ab1. 
Dec 16 02:09:37.153000 audit: BPF prog-id=246 op=LOAD Dec 16 02:09:37.153000 audit: BPF prog-id=247 op=LOAD Dec 16 02:09:37.153000 audit[4824]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4752 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.153000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537663565653861623665316564376562306430316233346364643830 Dec 16 02:09:37.153000 audit: BPF prog-id=247 op=UNLOAD Dec 16 02:09:37.153000 audit[4824]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4752 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.153000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537663565653861623665316564376562306430316233346364643830 Dec 16 02:09:37.153000 audit: BPF prog-id=248 op=LOAD Dec 16 02:09:37.153000 audit[4824]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=4752 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.153000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537663565653861623665316564376562306430316233346364643830 Dec 16 02:09:37.154000 audit: BPF prog-id=249 op=LOAD Dec 16 02:09:37.154000 audit[4824]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=4752 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.154000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537663565653861623665316564376562306430316233346364643830 Dec 16 02:09:37.154000 audit: BPF prog-id=249 op=UNLOAD Dec 16 02:09:37.154000 audit[4824]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4752 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.154000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537663565653861623665316564376562306430316233346364643830 Dec 16 02:09:37.154000 audit: BPF prog-id=248 op=UNLOAD Dec 16 02:09:37.154000 audit[4824]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4752 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.154000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537663565653861623665316564376562306430316233346364643830 Dec 16 02:09:37.154000 audit: BPF prog-id=250 op=LOAD Dec 16 02:09:37.154000 audit[4824]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=4752 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.154000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537663565653861623665316564376562306430316233346364643830 Dec 16 02:09:37.185445 containerd[1587]: time="2025-12-16T02:09:37.185391938Z" level=info msg="StartContainer for \"57f5ee8ab6e1ed7eb0d01b34cdd809d4f5cf1273d673da30caf66e9a2bdd1ab1\" returns successfully" Dec 16 02:09:37.407483 systemd-networkd[1480]: calib9e7e02d694: Gained IPv6LL Dec 16 02:09:37.451764 containerd[1587]: time="2025-12-16T02:09:37.451700444Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rp5zk,Uid:919dd2b2-2bc2-4394-9fed-3f9f47f938e5,Namespace:calico-system,Attempt:0,}" Dec 16 02:09:37.600885 systemd-networkd[1480]: cali23118d2e215: Link UP Dec 16 02:09:37.601174 systemd-networkd[1480]: cali23118d2e215: Gained carrier Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.501 [INFO][4859] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0 csi-node-driver- calico-system 919dd2b2-2bc2-4394-9fed-3f9f47f938e5 719 0 2025-12-16 02:09:13 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-0-0-9-be0981937a csi-node-driver-rp5zk eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali23118d2e215 [] [] }} ContainerID="3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" Namespace="calico-system" Pod="csi-node-driver-rp5zk" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.501 [INFO][4859] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" Namespace="calico-system" Pod="csi-node-driver-rp5zk" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.536 [INFO][4872] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" 
HandleID="k8s-pod-network.3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" Workload="ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.536 [INFO][4872] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" HandleID="k8s-pod-network.3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" Workload="ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024afd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-9-be0981937a", "pod":"csi-node-driver-rp5zk", "timestamp":"2025-12-16 02:09:37.536486435 +0000 UTC"}, Hostname:"ci-4547-0-0-9-be0981937a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.536 [INFO][4872] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.536 [INFO][4872] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.536 [INFO][4872] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-9-be0981937a' Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.549 [INFO][4872] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.555 [INFO][4872] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.562 [INFO][4872] ipam/ipam.go 511: Trying affinity for 192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.566 [INFO][4872] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.571 [INFO][4872] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.192/26 host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.571 [INFO][4872] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.192/26 handle="k8s-pod-network.3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.576 [INFO][4872] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1 Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.583 [INFO][4872] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.192/26 handle="k8s-pod-network.3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.591 [INFO][4872] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.200/26] block=192.168.115.192/26 handle="k8s-pod-network.3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 
02:09:37.591 [INFO][4872] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.200/26] handle="k8s-pod-network.3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" host="ci-4547-0-0-9-be0981937a" Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.592 [INFO][4872] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 02:09:37.620233 containerd[1587]: 2025-12-16 02:09:37.592 [INFO][4872] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.200/26] IPv6=[] ContainerID="3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" HandleID="k8s-pod-network.3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" Workload="ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0" Dec 16 02:09:37.621592 containerd[1587]: 2025-12-16 02:09:37.596 [INFO][4859] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" Namespace="calico-system" Pod="csi-node-driver-rp5zk" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"919dd2b2-2bc2-4394-9fed-3f9f47f938e5", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"", Pod:"csi-node-driver-rp5zk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.115.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali23118d2e215", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:37.621592 containerd[1587]: 2025-12-16 02:09:37.596 [INFO][4859] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.200/32] ContainerID="3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" Namespace="calico-system" Pod="csi-node-driver-rp5zk" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0" Dec 16 02:09:37.621592 containerd[1587]: 2025-12-16 02:09:37.596 [INFO][4859] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali23118d2e215 ContainerID="3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" Namespace="calico-system" Pod="csi-node-driver-rp5zk" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0" Dec 16 02:09:37.621592 containerd[1587]: 2025-12-16 02:09:37.600 [INFO][4859] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" Namespace="calico-system" Pod="csi-node-driver-rp5zk" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0" Dec 16 02:09:37.621592 containerd[1587]: 2025-12-16 02:09:37.601 [INFO][4859] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" Namespace="calico-system" Pod="csi-node-driver-rp5zk" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"919dd2b2-2bc2-4394-9fed-3f9f47f938e5", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 2, 9, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-9-be0981937a", ContainerID:"3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1", Pod:"csi-node-driver-rp5zk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.115.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali23118d2e215", MAC:"5a:36:d1:cb:b7:b8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 02:09:37.621592 containerd[1587]: 2025-12-16 02:09:37.616 [INFO][4859] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" Namespace="calico-system" Pod="csi-node-driver-rp5zk" WorkloadEndpoint="ci--4547--0--0--9--be0981937a-k8s-csi--node--driver--rp5zk-eth0" Dec 16 02:09:37.635000 audit[4885]: NETFILTER_CFG table=filter:135 family=2 entries=60 op=nft_register_chain pid=4885 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 02:09:37.635000 audit[4885]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26704 a0=3 a1=ffffe2cab800 a2=0 a3=ffffa9016fa8 items=0 ppid=4293 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.635000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 02:09:37.646399 containerd[1587]: time="2025-12-16T02:09:37.646336766Z" level=info msg="connecting to shim 3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1" 
address="unix:///run/containerd/s/40363eafb2086c7a0917671b31dc80038984f528a2438c04b9dfe3add96ba2af" namespace=k8s.io protocol=ttrpc version=3 Dec 16 02:09:37.676553 systemd[1]: Started cri-containerd-3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1.scope - libcontainer container 3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1. Dec 16 02:09:37.693000 audit: BPF prog-id=251 op=LOAD Dec 16 02:09:37.693000 audit: BPF prog-id=252 op=LOAD Dec 16 02:09:37.693000 audit[4906]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4895 pid=4906 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361663738373132646535343637356339373762323535336165653631 Dec 16 02:09:37.694000 audit: BPF prog-id=252 op=UNLOAD Dec 16 02:09:37.694000 audit[4906]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4895 pid=4906 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.694000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361663738373132646535343637356339373762323535336165653631 Dec 16 02:09:37.694000 audit: BPF prog-id=253 op=LOAD Dec 16 02:09:37.694000 audit[4906]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4895 pid=4906 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.694000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361663738373132646535343637356339373762323535336165653631 Dec 16 02:09:37.694000 audit: BPF prog-id=254 op=LOAD Dec 16 02:09:37.694000 audit[4906]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4895 pid=4906 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.694000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361663738373132646535343637356339373762323535336165653631 Dec 16 02:09:37.694000 audit: BPF prog-id=254 op=UNLOAD Dec 16 02:09:37.694000 audit[4906]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4895 pid=4906 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.694000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361663738373132646535343637356339373762323535336165653631 Dec 16 02:09:37.694000 audit: BPF prog-id=253 op=UNLOAD Dec 16 02:09:37.694000 audit[4906]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4895 pid=4906 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.694000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361663738373132646535343637356339373762323535336165653631 Dec 16 02:09:37.694000 audit: BPF prog-id=255 op=LOAD Dec 16 02:09:37.694000 audit[4906]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4895 pid=4906 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.694000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361663738373132646535343637356339373762323535336165653631 Dec 16 02:09:37.719005 containerd[1587]: time="2025-12-16T02:09:37.718576468Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rp5zk,Uid:919dd2b2-2bc2-4394-9fed-3f9f47f938e5,Namespace:calico-system,Attempt:0,} returns sandbox id \"3af78712de54675c977b2553aee61b0d10a9a6066ee1fc5e6f46beb0e4a838b1\"" Dec 16 02:09:37.724194 containerd[1587]: time="2025-12-16T02:09:37.724104535Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 02:09:37.737349 kubelet[2826]: E1216 02:09:37.736765 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:09:37.787723 kubelet[2826]: I1216 02:09:37.786674 2826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-572qq" podStartSLOduration=46.786656581 podStartE2EDuration="46.786656581s" podCreationTimestamp="2025-12-16 02:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 02:09:37.754245221 +0000 UTC m=+52.471510223" watchObservedRunningTime="2025-12-16 02:09:37.786656581 +0000 UTC m=+52.503921583" Dec 16 02:09:37.831293 kubelet[2826]: I1216 02:09:37.831188 2826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-9kfwg" podStartSLOduration=46.831167912 podStartE2EDuration="46.831167912s" podCreationTimestamp="2025-12-16 02:08:51 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 02:09:37.830453553 +0000 UTC m=+52.547718555" watchObservedRunningTime="2025-12-16 02:09:37.831167912 +0000 UTC m=+52.548432914" Dec 16 02:09:37.902000 audit[4933]: NETFILTER_CFG table=filter:136 family=2 entries=17 op=nft_register_rule pid=4933 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:37.902000 audit[4933]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff9d77f90 a2=0 a3=1 items=0 ppid=2986 pid=4933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.902000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:37.917000 audit[4933]: NETFILTER_CFG table=nat:137 family=2 entries=47 op=nft_register_chain pid=4933 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:09:37.917000 audit[4933]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=fffff9d77f90 a2=0 a3=1 items=0 ppid=2986 pid=4933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:09:37.917000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:09:38.072345 containerd[1587]: time="2025-12-16T02:09:38.072230568Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:38.073706 containerd[1587]: time="2025-12-16T02:09:38.073613128Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 02:09:38.073832 containerd[1587]: time="2025-12-16T02:09:38.073764568Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:38.075114 kubelet[2826]: E1216 02:09:38.073980 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 02:09:38.075114 kubelet[2826]: E1216 02:09:38.074083 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 02:09:38.075114 kubelet[2826]: E1216 02:09:38.074223 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:38.078188 containerd[1587]: time="2025-12-16T02:09:38.077485088Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 
02:09:38.175210 systemd-networkd[1480]: cali15a42eafa1c: Gained IPv6LL Dec 16 02:09:38.418552 containerd[1587]: time="2025-12-16T02:09:38.418337012Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:38.420163 containerd[1587]: time="2025-12-16T02:09:38.420077612Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 02:09:38.420739 containerd[1587]: time="2025-12-16T02:09:38.420233452Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:38.420870 kubelet[2826]: E1216 02:09:38.420438 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 02:09:38.420870 kubelet[2826]: E1216 02:09:38.420488 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 02:09:38.420870 kubelet[2826]: E1216 02:09:38.420601 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:38.420870 kubelet[2826]: E1216 02:09:38.420651 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:09:38.741718 kubelet[2826]: E1216 02:09:38.741607 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:09:38.752232 systemd-networkd[1480]: caliea3ef94003c: Gained IPv6LL Dec 16 02:09:39.647474 systemd-networkd[1480]: cali23118d2e215: Gained IPv6LL Dec 16 02:09:39.743493 kubelet[2826]: E1216 02:09:39.743443 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:09:44.451040 containerd[1587]: time="2025-12-16T02:09:44.450983553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 02:09:44.799793 containerd[1587]: time="2025-12-16T02:09:44.799487924Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:44.801201 containerd[1587]: time="2025-12-16T02:09:44.801121385Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 02:09:44.801387 containerd[1587]: time="2025-12-16T02:09:44.801268547Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:44.802080 kubelet[2826]: E1216 02:09:44.801581 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 02:09:44.802080 kubelet[2826]: E1216 02:09:44.801632 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 02:09:44.802080 kubelet[2826]: E1216 02:09:44.801718 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-b648c4bbd-x7xsf_calico-system(76743683-d50e-4bc9-aceb-a84e73d5c7be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:44.804711 containerd[1587]: time="2025-12-16T02:09:44.804648269Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 
02:09:45.155924 containerd[1587]: time="2025-12-16T02:09:45.155675805Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:45.157692 containerd[1587]: time="2025-12-16T02:09:45.157572712Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 02:09:45.157956 containerd[1587]: time="2025-12-16T02:09:45.157627953Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:45.158044 kubelet[2826]: E1216 02:09:45.157805 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 02:09:45.158044 kubelet[2826]: E1216 02:09:45.157852 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 02:09:45.158044 kubelet[2826]: E1216 02:09:45.157926 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-b648c4bbd-x7xsf_calico-system(76743683-d50e-4bc9-aceb-a84e73d5c7be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:45.158044 kubelet[2826]: E1216 02:09:45.157969 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:09:46.450743 containerd[1587]: time="2025-12-16T02:09:46.450682107Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 02:09:46.803658 containerd[1587]: time="2025-12-16T02:09:46.803414319Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:46.805202 containerd[1587]: time="2025-12-16T02:09:46.804998145Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 02:09:46.805374 containerd[1587]: time="2025-12-16T02:09:46.805232629Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 
02:09:46.805743 kubelet[2826]: E1216 02:09:46.805663 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:09:46.805743 kubelet[2826]: E1216 02:09:46.805730 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:09:46.806296 kubelet[2826]: E1216 02:09:46.805847 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-86cf67c95b-68pm7_calico-apiserver(4f109e8a-b6cd-4daa-a636-a987203ce9dc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:46.806296 kubelet[2826]: E1216 02:09:46.805903 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:09:50.451124 containerd[1587]: time="2025-12-16T02:09:50.450908178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 02:09:50.788562 containerd[1587]: time="2025-12-16T02:09:50.788294097Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:50.790163 containerd[1587]: time="2025-12-16T02:09:50.790043457Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 02:09:50.790505 containerd[1587]: time="2025-12-16T02:09:50.790097698Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:50.790609 kubelet[2826]: E1216 02:09:50.790299 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:09:50.790609 kubelet[2826]: E1216 02:09:50.790343 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:09:50.793251 kubelet[2826]: E1216 02:09:50.790888 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-86cf67c95b-xtmz8_calico-apiserver(24ee6e1b-64f9-47f5-86c3-17009d2e74c9): ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:50.793251 kubelet[2826]: E1216 02:09:50.791237 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:09:50.793308 containerd[1587]: time="2025-12-16T02:09:50.790725072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 02:09:51.132052 containerd[1587]: time="2025-12-16T02:09:51.131713558Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:51.133653 containerd[1587]: time="2025-12-16T02:09:51.133560963Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 02:09:51.134016 containerd[1587]: time="2025-12-16T02:09:51.133615964Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:51.134258 kubelet[2826]: E1216 02:09:51.134190 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 02:09:51.134490 kubelet[2826]: E1216 02:09:51.134326 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 02:09:51.134490 kubelet[2826]: E1216 02:09:51.134419 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-75666888-t2jlw_calico-system(a4f3ee57-42ce-4008-b96e-85199f6fd632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:51.134490 kubelet[2826]: E1216 02:09:51.134453 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:09:52.451576 containerd[1587]: time="2025-12-16T02:09:52.451436279Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 02:09:52.788601 containerd[1587]: 
time="2025-12-16T02:09:52.788324579Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:52.790307 containerd[1587]: time="2025-12-16T02:09:52.790243789Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 02:09:52.790513 containerd[1587]: time="2025-12-16T02:09:52.790366392Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:52.790807 kubelet[2826]: E1216 02:09:52.790681 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 02:09:52.790807 kubelet[2826]: E1216 02:09:52.790758 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 02:09:52.791558 kubelet[2826]: E1216 02:09:52.791199 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-cl8m7_calico-system(e4bfcf46-398e-437f-b7f1-81589479eeb2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:52.791558 kubelet[2826]: E1216 02:09:52.791244 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:09:54.451773 containerd[1587]: time="2025-12-16T02:09:54.451383435Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 02:09:54.785164 containerd[1587]: time="2025-12-16T02:09:54.784974808Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:54.787341 containerd[1587]: time="2025-12-16T02:09:54.787262994Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 02:09:54.787469 containerd[1587]: time="2025-12-16T02:09:54.787371037Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:54.787987 kubelet[2826]: E1216 02:09:54.787608 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 02:09:54.788814 kubelet[2826]: E1216 02:09:54.788470 2826 kuberuntime_image.go:43] "Failed to 
pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 02:09:54.788814 kubelet[2826]: E1216 02:09:54.788570 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:54.789943 containerd[1587]: time="2025-12-16T02:09:54.789896390Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 02:09:55.126168 containerd[1587]: time="2025-12-16T02:09:55.125970686Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:09:55.127970 containerd[1587]: time="2025-12-16T02:09:55.127827942Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 02:09:55.128215 containerd[1587]: time="2025-12-16T02:09:55.127866503Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 02:09:55.128932 kubelet[2826]: E1216 02:09:55.128844 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 02:09:55.128932 kubelet[2826]: E1216 02:09:55.128905 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 02:09:55.129079 kubelet[2826]: E1216 02:09:55.129000 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 02:09:55.129109 kubelet[2826]: E1216 02:09:55.129065 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not 
found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:09:59.450998 kubelet[2826]: E1216 02:09:59.450916 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:10:00.454298 kubelet[2826]: E1216 02:10:00.453258 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:10:04.453064 kubelet[2826]: E1216 02:10:04.452563 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:10:05.454015 kubelet[2826]: E1216 02:10:05.453916 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:10:06.451239 kubelet[2826]: E1216 02:10:06.451161 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:10:06.453954 kubelet[2826]: E1216 02:10:06.453895 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:10:11.453577 containerd[1587]: time="2025-12-16T02:10:11.453273671Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 02:10:11.798334 containerd[1587]: time="2025-12-16T02:10:11.797757343Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:10:11.800201 containerd[1587]: time="2025-12-16T02:10:11.800043411Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 02:10:11.800201 containerd[1587]: time="2025-12-16T02:10:11.800139856Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 02:10:11.800404 kubelet[2826]: E1216 02:10:11.800317 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:10:11.800404 kubelet[2826]: E1216 02:10:11.800366 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:10:11.802433 kubelet[2826]: E1216 02:10:11.800566 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-86cf67c95b-68pm7_calico-apiserver(4f109e8a-b6cd-4daa-a636-a987203ce9dc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 02:10:11.802433 kubelet[2826]: E1216 02:10:11.800604 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:10:11.803044 containerd[1587]: time="2025-12-16T02:10:11.802715297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 02:10:12.137488 containerd[1587]: 
time="2025-12-16T02:10:12.136903271Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:10:12.139339 containerd[1587]: time="2025-12-16T02:10:12.139261624Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 02:10:12.139615 containerd[1587]: time="2025-12-16T02:10:12.139461714Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 02:10:12.140048 kubelet[2826]: E1216 02:10:12.139841 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 02:10:12.140048 kubelet[2826]: E1216 02:10:12.139889 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 02:10:12.140048 kubelet[2826]: E1216 02:10:12.139972 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-b648c4bbd-x7xsf_calico-system(76743683-d50e-4bc9-aceb-a84e73d5c7be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 02:10:12.141736 containerd[1587]: time="2025-12-16T02:10:12.141700782Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 02:10:12.493706 containerd[1587]: time="2025-12-16T02:10:12.493643487Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:10:12.495475 containerd[1587]: time="2025-12-16T02:10:12.495400451Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 02:10:12.495711 containerd[1587]: time="2025-12-16T02:10:12.495444333Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 02:10:12.496010 kubelet[2826]: E1216 02:10:12.495971 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 02:10:12.496229 kubelet[2826]: E1216 02:10:12.496034 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 02:10:12.496229 kubelet[2826]: E1216 02:10:12.496114 2826 kuberuntime_manager.go:1449] "Unhandled Error" 
err="container whisker-backend start failed in pod whisker-b648c4bbd-x7xsf_calico-system(76743683-d50e-4bc9-aceb-a84e73d5c7be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 02:10:12.496229 kubelet[2826]: E1216 02:10:12.496160 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:10:16.451276 containerd[1587]: time="2025-12-16T02:10:16.451236385Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 02:10:16.801253 containerd[1587]: time="2025-12-16T02:10:16.801100464Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:10:16.803937 containerd[1587]: time="2025-12-16T02:10:16.803757719Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 02:10:16.803937 containerd[1587]: time="2025-12-16T02:10:16.803854484Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 02:10:16.804614 kubelet[2826]: E1216 02:10:16.804191 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 02:10:16.804614 kubelet[2826]: E1216 02:10:16.804242 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 02:10:16.804614 kubelet[2826]: E1216 02:10:16.804340 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-75666888-t2jlw_calico-system(a4f3ee57-42ce-4008-b96e-85199f6fd632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 02:10:16.804614 kubelet[2826]: E1216 02:10:16.804371 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:10:17.450788 containerd[1587]: time="2025-12-16T02:10:17.450730614Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 02:10:17.786992 containerd[1587]: time="2025-12-16T02:10:17.786590928Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:10:17.788694 containerd[1587]: time="2025-12-16T02:10:17.788518908Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 02:10:17.788694 containerd[1587]: time="2025-12-16T02:10:17.788633714Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 02:10:17.788936 kubelet[2826]: E1216 02:10:17.788806 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 02:10:17.788936 kubelet[2826]: E1216 02:10:17.788850 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 02:10:17.788936 kubelet[2826]: E1216 02:10:17.788920 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 02:10:17.790669 containerd[1587]: time="2025-12-16T02:10:17.790643418Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 02:10:18.145018 containerd[1587]: time="2025-12-16T02:10:18.144412272Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:10:18.146410 containerd[1587]: time="2025-12-16T02:10:18.146248848Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 02:10:18.146410 containerd[1587]: time="2025-12-16T02:10:18.146355853Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 02:10:18.146609 kubelet[2826]: E1216 02:10:18.146569 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 02:10:18.146869 kubelet[2826]: E1216 02:10:18.146613 
2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 02:10:18.146869 kubelet[2826]: E1216 02:10:18.146678 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 02:10:18.146869 kubelet[2826]: E1216 02:10:18.146717 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:10:18.449604 containerd[1587]: time="2025-12-16T02:10:18.449497397Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 02:10:18.790205 containerd[1587]: time="2025-12-16T02:10:18.789744442Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:10:18.791139 containerd[1587]: time="2025-12-16T02:10:18.791096353Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 02:10:18.791330 containerd[1587]: time="2025-12-16T02:10:18.791194438Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 02:10:18.791581 kubelet[2826]: E1216 02:10:18.791538 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 02:10:18.791848 kubelet[2826]: E1216 02:10:18.791744 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 02:10:18.792044 kubelet[2826]: E1216 02:10:18.791960 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-cl8m7_calico-system(e4bfcf46-398e-437f-b7f1-81589479eeb2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 02:10:18.792135 kubelet[2826]: E1216 02:10:18.792015 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:10:19.450934 containerd[1587]: time="2025-12-16T02:10:19.450791443Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 02:10:19.791314 containerd[1587]: time="2025-12-16T02:10:19.790946341Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:10:19.792579 containerd[1587]: time="2025-12-16T02:10:19.792436100Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 02:10:19.792579 containerd[1587]: time="2025-12-16T02:10:19.792492023Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 02:10:19.793615 kubelet[2826]: E1216 02:10:19.793564 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:10:19.793966 kubelet[2826]: E1216 02:10:19.793625 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:10:19.793966 kubelet[2826]: E1216 02:10:19.793726 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-86cf67c95b-xtmz8_calico-apiserver(24ee6e1b-64f9-47f5-86c3-17009d2e74c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 02:10:19.793966 kubelet[2826]: E1216 02:10:19.793762 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:10:23.456514 kubelet[2826]: E1216 02:10:23.456453 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not 
found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:10:24.449254 kubelet[2826]: E1216 02:10:24.449200 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:10:31.451533 kubelet[2826]: E1216 02:10:31.450882 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:10:31.451533 kubelet[2826]: E1216 02:10:31.451087 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:10:31.452977 kubelet[2826]: E1216 02:10:31.452833 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:10:34.451051 kubelet[2826]: E1216 02:10:34.450736 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:10:35.458863 kubelet[2826]: E1216 02:10:35.458773 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:10:36.451255 kubelet[2826]: E1216 02:10:36.450634 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:10:42.450845 kubelet[2826]: E1216 02:10:42.450700 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:10:45.451646 kubelet[2826]: E1216 02:10:45.451425 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:10:46.451067 kubelet[2826]: E1216 02:10:46.450786 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:10:46.452424 
kubelet[2826]: E1216 02:10:46.452380 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:10:48.451352 kubelet[2826]: E1216 02:10:48.451283 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:10:49.449805 kubelet[2826]: E1216 02:10:49.449730 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:10:57.450643 containerd[1587]: time="2025-12-16T02:10:57.450540734Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 02:10:57.451554 kubelet[2826]: E1216 02:10:57.450851 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:10:57.785358 containerd[1587]: time="2025-12-16T02:10:57.784774164Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:10:57.786270 containerd[1587]: time="2025-12-16T02:10:57.786093532Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 02:10:57.786270 containerd[1587]: time="2025-12-16T02:10:57.786187178Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 02:10:57.786788 kubelet[2826]: E1216 02:10:57.786735 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 02:10:57.786788 kubelet[2826]: E1216 02:10:57.786787 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 02:10:57.786920 kubelet[2826]: E1216 02:10:57.786859 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-75666888-t2jlw_calico-system(a4f3ee57-42ce-4008-b96e-85199f6fd632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 02:10:57.786920 kubelet[2826]: E1216 02:10:57.786892 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:10:59.452327 containerd[1587]: time="2025-12-16T02:10:59.450123796Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 02:10:59.795822 containerd[1587]: time="2025-12-16T02:10:59.795542781Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:10:59.797164 containerd[1587]: time="2025-12-16T02:10:59.797094125Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 02:10:59.797322 containerd[1587]: time="2025-12-16T02:10:59.797256336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 02:10:59.797566 kubelet[2826]: E1216 02:10:59.797517 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 02:10:59.797878 kubelet[2826]: E1216 02:10:59.797606 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 02:10:59.797907 kubelet[2826]: E1216 02:10:59.797871 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-cl8m7_calico-system(e4bfcf46-398e-437f-b7f1-81589479eeb2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 02:10:59.798479 kubelet[2826]: E1216 02:10:59.798440 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:11:00.451687 containerd[1587]: time="2025-12-16T02:11:00.451401236Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 02:11:00.786957 containerd[1587]: time="2025-12-16T02:11:00.786732281Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:11:00.788471 containerd[1587]: time="2025-12-16T02:11:00.788388272Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 02:11:00.788566 containerd[1587]: time="2025-12-16T02:11:00.788522361Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 02:11:00.788935 kubelet[2826]: E1216 02:11:00.788886 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 02:11:00.789104 kubelet[2826]: E1216 02:11:00.789083 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 02:11:00.789518 kubelet[2826]: E1216 02:11:00.789330 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 02:11:00.790000 containerd[1587]: time="2025-12-16T02:11:00.789700201Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 02:11:01.146216 containerd[1587]: time="2025-12-16T02:11:01.146083289Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:11:01.147890 containerd[1587]: time="2025-12-16T02:11:01.147777964Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 02:11:01.148112 containerd[1587]: time="2025-12-16T02:11:01.147852369Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 02:11:01.148143 kubelet[2826]: E1216 02:11:01.148082 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 02:11:01.148404 kubelet[2826]: E1216 02:11:01.148168 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 02:11:01.148404 kubelet[2826]: E1216 02:11:01.148348 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-b648c4bbd-x7xsf_calico-system(76743683-d50e-4bc9-aceb-a84e73d5c7be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 02:11:01.149307 containerd[1587]: time="2025-12-16T02:11:01.148793432Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 02:11:01.498907 containerd[1587]: time="2025-12-16T02:11:01.498643595Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:11:01.500554 containerd[1587]: time="2025-12-16T02:11:01.500385952Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 02:11:01.500554 containerd[1587]: time="2025-12-16T02:11:01.500493440Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 02:11:01.500752 kubelet[2826]: E1216 02:11:01.500672 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 02:11:01.500752 kubelet[2826]: E1216 02:11:01.500717 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 02:11:01.500966 kubelet[2826]: E1216 02:11:01.500924 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 02:11:01.501060 kubelet[2826]: E1216 02:11:01.500976 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:11:01.502694 containerd[1587]: time="2025-12-16T02:11:01.502629704Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 02:11:01.843490 containerd[1587]: time="2025-12-16T02:11:01.842704486Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:11:01.844977 containerd[1587]: time="2025-12-16T02:11:01.844775586Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 02:11:01.844977 containerd[1587]: time="2025-12-16T02:11:01.844881113Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 02:11:01.845489 kubelet[2826]: E1216 02:11:01.845428 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 02:11:01.845489 kubelet[2826]: E1216 02:11:01.845491 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 02:11:01.845648 kubelet[2826]: E1216 02:11:01.845576 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-b648c4bbd-x7xsf_calico-system(76743683-d50e-4bc9-aceb-a84e73d5c7be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 02:11:01.845648 kubelet[2826]: E1216 02:11:01.845620 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:11:02.450037 containerd[1587]: time="2025-12-16T02:11:02.449873550Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 02:11:02.785525 containerd[1587]: time="2025-12-16T02:11:02.785390199Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:11:02.786888 containerd[1587]: time="2025-12-16T02:11:02.786759452Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 02:11:02.786888 containerd[1587]: time="2025-12-16T02:11:02.786821296Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 02:11:02.787267 kubelet[2826]: E1216 02:11:02.787224 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:11:02.788133 kubelet[2826]: E1216 02:11:02.787587 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:11:02.788133 kubelet[2826]: E1216 02:11:02.787679 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-86cf67c95b-68pm7_calico-apiserver(4f109e8a-b6cd-4daa-a636-a987203ce9dc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 02:11:02.788133 kubelet[2826]: E1216 02:11:02.787710 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:11:10.388867 update_engine[1565]: I20251216 02:11:10.388756 1565 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Dec 16 02:11:10.391087 update_engine[1565]: I20251216 02:11:10.389593 1565 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Dec 16 02:11:10.391087 update_engine[1565]: I20251216 02:11:10.390213 1565 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Dec 16 02:11:10.393349 update_engine[1565]: I20251216 02:11:10.393177 1565 omaha_request_params.cc:62] Current group set to alpha Dec 16 02:11:10.394828 update_engine[1565]: I20251216 02:11:10.394578 1565 update_attempter.cc:499] Already updated boot flags. Skipping. 
Dec 16 02:11:10.395200 update_engine[1565]: I20251216 02:11:10.394716 1565 update_attempter.cc:643] Scheduling an action processor start. Dec 16 02:11:10.395200 update_engine[1565]: I20251216 02:11:10.395087 1565 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 16 02:11:10.398849 update_engine[1565]: I20251216 02:11:10.398815 1565 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Dec 16 02:11:10.399021 update_engine[1565]: I20251216 02:11:10.399005 1565 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 16 02:11:10.400087 update_engine[1565]: I20251216 02:11:10.399082 1565 omaha_request_action.cc:272] Request: Dec 16 02:11:10.400087 update_engine[1565]: Dec 16 02:11:10.400087 update_engine[1565]: Dec 16 02:11:10.400087 update_engine[1565]: Dec 16 02:11:10.400087 update_engine[1565]: Dec 16 02:11:10.400087 update_engine[1565]: Dec 16 02:11:10.400087 update_engine[1565]: Dec 16 02:11:10.400087 update_engine[1565]: Dec 16 02:11:10.400087 update_engine[1565]: Dec 16 02:11:10.400087 update_engine[1565]: I20251216 02:11:10.399096 1565 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 02:11:10.402391 update_engine[1565]: I20251216 02:11:10.402071 1565 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 02:11:10.403290 update_engine[1565]: I20251216 02:11:10.403262 1565 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 16 02:11:10.403723 update_engine[1565]: E20251216 02:11:10.403700 1565 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 02:11:10.403840 update_engine[1565]: I20251216 02:11:10.403823 1565 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Dec 16 02:11:10.406046 locksmithd[1612]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Dec 16 02:11:10.451974 containerd[1587]: time="2025-12-16T02:11:10.451875778Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 02:11:10.794081 containerd[1587]: time="2025-12-16T02:11:10.793994583Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:11:10.795948 containerd[1587]: time="2025-12-16T02:11:10.795905394Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 02:11:10.796051 containerd[1587]: time="2025-12-16T02:11:10.796000441Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 02:11:10.796275 kubelet[2826]: E1216 02:11:10.796233 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:11:10.796627 kubelet[2826]: E1216 02:11:10.796284 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 02:11:10.798052 kubelet[2826]: E1216 02:11:10.797066 2826 
kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-86cf67c95b-xtmz8_calico-apiserver(24ee6e1b-64f9-47f5-86c3-17009d2e74c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 02:11:10.798052 kubelet[2826]: E1216 02:11:10.797129 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:11:11.450695 kubelet[2826]: E1216 02:11:11.450178 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:11:13.450624 kubelet[2826]: E1216 02:11:13.449886 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:11:14.448922 kubelet[2826]: E1216 02:11:14.448839 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:11:15.453658 kubelet[2826]: E1216 02:11:15.453590 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" 
pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:11:16.452632 kubelet[2826]: E1216 02:11:16.452569 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:11:20.095069 kernel: kauditd_printk_skb: 431 callbacks suppressed Dec 16 02:11:20.095195 kernel: audit: type=1130 audit(1765851080.091:731): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-49.13.61.135:22-139.178.68.195:42274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:20.091000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-49.13.61.135:22-139.178.68.195:42274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:20.092206 systemd[1]: Started sshd@7-49.13.61.135:22-139.178.68.195:42274.service - OpenSSH per-connection server daemon (139.178.68.195:42274). Dec 16 02:11:20.305160 update_engine[1565]: I20251216 02:11:20.305083 1565 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 02:11:20.305501 update_engine[1565]: I20251216 02:11:20.305177 1565 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 02:11:20.305930 update_engine[1565]: I20251216 02:11:20.305567 1565 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 16 02:11:20.306142 update_engine[1565]: E20251216 02:11:20.306093 1565 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 02:11:20.306197 update_engine[1565]: I20251216 02:11:20.306176 1565 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Dec 16 02:11:20.942000 audit[5096]: USER_ACCT pid=5096 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:20.946320 sshd[5096]: Accepted publickey for core from 139.178.68.195 port 42274 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:11:20.947096 kernel: audit: type=1101 audit(1765851080.942:732): pid=5096 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:20.947000 audit[5096]: CRED_ACQ pid=5096 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:20.951679 sshd-session[5096]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:11:20.953952 kernel: audit: type=1103 audit(1765851080.947:733): pid=5096 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:20.954341 kernel: audit: type=1006 audit(1765851080.947:734): pid=5096 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Dec 16 02:11:20.957353 kernel: audit: type=1300 audit(1765851080.947:734): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeb9627b0 a2=3 a3=0 items=0 ppid=1 pid=5096 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:20.947000 audit[5096]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeb9627b0 a2=3 a3=0 items=0 ppid=1 pid=5096 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:20.961131 kernel: audit: type=1327 audit(1765851080.947:734): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:20.947000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:20.963097 systemd-logind[1564]: New session 9 of user core. Dec 16 02:11:20.974596 systemd[1]: Started session-9.scope - Session 9 of User core. 
Dec 16 02:11:20.978000 audit[5096]: USER_START pid=5096 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:20.987154 kernel: audit: type=1105 audit(1765851080.978:735): pid=5096 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:20.987270 kernel: audit: type=1103 audit(1765851080.985:736): pid=5100 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:20.985000 audit[5100]: CRED_ACQ pid=5100 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:21.529629 sshd[5100]: Connection closed by 139.178.68.195 port 42274 Dec 16 02:11:21.530263 sshd-session[5096]: pam_unix(sshd:session): session closed for user core Dec 16 02:11:21.532000 audit[5096]: USER_END pid=5096 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:21.532000 audit[5096]: CRED_DISP pid=5096 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:21.537945 kernel: audit: type=1106 audit(1765851081.532:737): pid=5096 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:21.538044 kernel: audit: type=1104 audit(1765851081.532:738): pid=5096 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:21.538827 systemd[1]: sshd@7-49.13.61.135:22-139.178.68.195:42274.service: Deactivated successfully. Dec 16 02:11:21.540000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-49.13.61.135:22-139.178.68.195:42274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:21.543879 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 02:11:21.546907 systemd-logind[1564]: Session 9 logged out. Waiting for processes to exit. Dec 16 02:11:21.548682 systemd-logind[1564]: Removed session 9. 
Dec 16 02:11:24.451524 kubelet[2826]: E1216 02:11:24.451448 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:11:24.452477 kubelet[2826]: E1216 02:11:24.452418 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:11:26.450041 kubelet[2826]: E1216 02:11:26.449952 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:11:26.700724 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 02:11:26.700834 kernel: audit: type=1130 audit(1765851086.696:740): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-49.13.61.135:22-139.178.68.195:50622 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:26.696000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-49.13.61.135:22-139.178.68.195:50622 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:26.697790 systemd[1]: Started sshd@8-49.13.61.135:22-139.178.68.195:50622.service - OpenSSH per-connection server daemon (139.178.68.195:50622). 
Dec 16 02:11:27.547000 audit[5117]: USER_ACCT pid=5117 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:27.549331 sshd[5117]: Accepted publickey for core from 139.178.68.195 port 50622 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:11:27.554710 kernel: audit: type=1101 audit(1765851087.547:741): pid=5117 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:27.554831 kernel: audit: type=1103 audit(1765851087.552:742): pid=5117 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:27.552000 audit[5117]: CRED_ACQ pid=5117 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:27.553998 sshd-session[5117]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:11:27.556503 kernel: audit: type=1006 audit(1765851087.552:743): pid=5117 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Dec 16 02:11:27.557114 kernel: audit: type=1300 audit(1765851087.552:743): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeac25d90 a2=3 a3=0 items=0 ppid=1 pid=5117 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:27.552000 audit[5117]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeac25d90 a2=3 a3=0 items=0 ppid=1 pid=5117 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:27.552000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:27.560451 kernel: audit: type=1327 audit(1765851087.552:743): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:27.565160 systemd-logind[1564]: New session 10 of user core. Dec 16 02:11:27.568316 systemd[1]: Started session-10.scope - Session 10 of User core. 
Dec 16 02:11:27.572000 audit[5117]: USER_START pid=5117 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:27.576000 audit[5121]: CRED_ACQ pid=5121 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:27.581098 kernel: audit: type=1105 audit(1765851087.572:744): pid=5117 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:27.581217 kernel: audit: type=1103 audit(1765851087.576:745): pid=5121 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:28.120447 sshd[5121]: Connection closed by 139.178.68.195 port 50622 Dec 16 02:11:28.121327 sshd-session[5117]: pam_unix(sshd:session): session closed for user core Dec 16 02:11:28.122000 audit[5117]: USER_END pid=5117 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:28.128906 kernel: audit: type=1106 audit(1765851088.122:746): pid=5117 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:28.129021 kernel: audit: type=1104 audit(1765851088.122:747): pid=5117 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:28.122000 audit[5117]: CRED_DISP pid=5117 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:28.128732 systemd[1]: sshd@8-49.13.61.135:22-139.178.68.195:50622.service: Deactivated successfully. Dec 16 02:11:28.130000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-49.13.61.135:22-139.178.68.195:50622 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:28.133879 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 02:11:28.137857 systemd-logind[1564]: Session 10 logged out. Waiting for processes to exit. 
Dec 16 02:11:28.142263 systemd-logind[1564]: Removed session 10. Dec 16 02:11:28.451937 kubelet[2826]: E1216 02:11:28.451834 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:11:28.451937 kubelet[2826]: E1216 02:11:28.451890 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:11:29.450805 kubelet[2826]: E1216 02:11:29.450341 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:11:30.298568 update_engine[1565]: I20251216 02:11:30.298066 1565 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 02:11:30.298568 update_engine[1565]: I20251216 02:11:30.298161 1565 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 02:11:30.298568 update_engine[1565]: I20251216 02:11:30.298523 1565 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 16 02:11:30.299250 update_engine[1565]: E20251216 02:11:30.299180 1565 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 02:11:30.301153 update_engine[1565]: I20251216 02:11:30.301113 1565 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Dec 16 02:11:33.290057 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 02:11:33.290200 kernel: audit: type=1130 audit(1765851093.285:749): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-49.13.61.135:22-139.178.68.195:48728 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:33.285000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-49.13.61.135:22-139.178.68.195:48728 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:33.286464 systemd[1]: Started sshd@9-49.13.61.135:22-139.178.68.195:48728.service - OpenSSH per-connection server daemon (139.178.68.195:48728). Dec 16 02:11:34.123000 audit[5158]: USER_ACCT pid=5158 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.127806 sshd[5158]: Accepted publickey for core from 139.178.68.195 port 48728 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:11:34.129089 kernel: audit: type=1101 audit(1765851094.123:750): pid=5158 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.129000 audit[5158]: CRED_ACQ pid=5158 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.134573 sshd-session[5158]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:11:34.136741 kernel: audit: type=1103 audit(1765851094.129:751): pid=5158 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.136858 kernel: audit: type=1006 audit(1765851094.129:752): pid=5158 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Dec 16 02:11:34.139846 kernel: audit: type=1300 audit(1765851094.129:752): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffde9539b0 a2=3 a3=0 items=0 ppid=1 pid=5158 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:34.129000 audit[5158]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffde9539b0 a2=3 a3=0 items=0 ppid=1 pid=5158 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 02:11:34.129000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:34.144166 kernel: audit: type=1327 audit(1765851094.129:752): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:34.149185 systemd-logind[1564]: New session 11 of user core. Dec 16 02:11:34.156320 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 16 02:11:34.161000 audit[5158]: USER_START pid=5158 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.171052 kernel: audit: type=1105 audit(1765851094.161:753): pid=5158 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.171178 kernel: audit: type=1103 audit(1765851094.169:754): pid=5162 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.169000 audit[5162]: CRED_ACQ pid=5162 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.708005 sshd[5162]: Connection closed by 139.178.68.195 port 48728 Dec 16 02:11:34.709196 sshd-session[5158]: pam_unix(sshd:session): session closed for user core Dec 16 02:11:34.711000 audit[5158]: USER_END pid=5158 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.717970 kernel: audit: type=1106 audit(1765851094.711:755): pid=5158 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.718171 kernel: audit: type=1104 audit(1765851094.711:756): pid=5158 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.711000 audit[5158]: CRED_DISP pid=5158 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:34.716383 systemd[1]: sshd@9-49.13.61.135:22-139.178.68.195:48728.service: Deactivated successfully. 
Dec 16 02:11:34.716000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-49.13.61.135:22-139.178.68.195:48728 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:34.720723 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 02:11:34.727798 systemd-logind[1564]: Session 11 logged out. Waiting for processes to exit. Dec 16 02:11:34.729151 systemd-logind[1564]: Removed session 11. Dec 16 02:11:34.876000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-49.13.61.135:22-139.178.68.195:48738 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:34.877148 systemd[1]: Started sshd@10-49.13.61.135:22-139.178.68.195:48738.service - OpenSSH per-connection server daemon (139.178.68.195:48738). Dec 16 02:11:35.725000 audit[5176]: USER_ACCT pid=5176 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:35.727362 sshd[5176]: Accepted publickey for core from 139.178.68.195 port 48738 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:11:35.728000 audit[5176]: CRED_ACQ pid=5176 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:35.728000 audit[5176]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff5845190 a2=3 a3=0 items=0 ppid=1 pid=5176 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:35.728000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:35.730063 sshd-session[5176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:11:35.737786 systemd-logind[1564]: New session 12 of user core. Dec 16 02:11:35.743411 systemd[1]: Started session-12.scope - Session 12 of User core. 
Dec 16 02:11:35.748000 audit[5176]: USER_START pid=5176 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:35.750000 audit[5180]: CRED_ACQ pid=5180 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:36.418488 sshd[5180]: Connection closed by 139.178.68.195 port 48738 Dec 16 02:11:36.421229 sshd-session[5176]: pam_unix(sshd:session): session closed for user core Dec 16 02:11:36.422000 audit[5176]: USER_END pid=5176 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:36.422000 audit[5176]: CRED_DISP pid=5176 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:36.426792 systemd[1]: sshd@10-49.13.61.135:22-139.178.68.195:48738.service: Deactivated successfully. Dec 16 02:11:36.426000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-49.13.61.135:22-139.178.68.195:48738 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:36.430017 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 02:11:36.434289 systemd-logind[1564]: Session 12 logged out. Waiting for processes to exit. Dec 16 02:11:36.438890 systemd-logind[1564]: Removed session 12. Dec 16 02:11:36.588000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-49.13.61.135:22-139.178.68.195:48748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:36.589336 systemd[1]: Started sshd@11-49.13.61.135:22-139.178.68.195:48748.service - OpenSSH per-connection server daemon (139.178.68.195:48748). 
Dec 16 02:11:37.430000 audit[5190]: USER_ACCT pid=5190 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:37.431564 sshd[5190]: Accepted publickey for core from 139.178.68.195 port 48748 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:11:37.431000 audit[5190]: CRED_ACQ pid=5190 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:37.431000 audit[5190]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcdc83b10 a2=3 a3=0 items=0 ppid=1 pid=5190 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:37.431000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:37.433802 sshd-session[5190]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:11:37.441309 systemd-logind[1564]: New session 13 of user core. Dec 16 02:11:37.445290 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 16 02:11:37.449000 audit[5190]: USER_START pid=5190 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:37.451198 kubelet[2826]: E1216 02:11:37.451042 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:11:37.453000 audit[5194]: CRED_ACQ pid=5194 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:38.076261 sshd[5194]: Connection closed by 139.178.68.195 port 48748 Dec 16 02:11:38.077248 sshd-session[5190]: pam_unix(sshd:session): session closed for user core Dec 16 02:11:38.079000 audit[5190]: USER_END pid=5190 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:38.079000 audit[5190]: CRED_DISP pid=5190 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Dec 16 02:11:38.086781 systemd[1]: sshd@11-49.13.61.135:22-139.178.68.195:48748.service: Deactivated successfully. Dec 16 02:11:38.087000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-49.13.61.135:22-139.178.68.195:48748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:38.087764 systemd-logind[1564]: Session 13 logged out. Waiting for processes to exit. Dec 16 02:11:38.094748 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 02:11:38.099687 systemd-logind[1564]: Removed session 13. Dec 16 02:11:38.449680 kubelet[2826]: E1216 02:11:38.449543 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:11:39.454519 kubelet[2826]: E1216 02:11:39.454456 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:11:40.299913 update_engine[1565]: I20251216 02:11:40.299078 1565 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 02:11:40.299913 update_engine[1565]: I20251216 02:11:40.299206 1565 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 02:11:40.299913 update_engine[1565]: I20251216 02:11:40.299813 1565 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 16 02:11:40.302427 update_engine[1565]: E20251216 02:11:40.302372 1565 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 02:11:40.302756 update_engine[1565]: I20251216 02:11:40.302675 1565 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 16 02:11:40.302756 update_engine[1565]: I20251216 02:11:40.302696 1565 omaha_request_action.cc:617] Omaha request response: Dec 16 02:11:40.302933 update_engine[1565]: E20251216 02:11:40.302881 1565 omaha_request_action.cc:636] Omaha request network transfer failed. Dec 16 02:11:40.303246 update_engine[1565]: I20251216 02:11:40.302999 1565 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Dec 16 02:11:40.303246 update_engine[1565]: I20251216 02:11:40.303011 1565 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 16 02:11:40.303246 update_engine[1565]: I20251216 02:11:40.303016 1565 update_attempter.cc:306] Processing Done. Dec 16 02:11:40.304428 update_engine[1565]: E20251216 02:11:40.303465 1565 update_attempter.cc:619] Update failed. 
Dec 16 02:11:40.304428 update_engine[1565]: I20251216 02:11:40.303495 1565 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Dec 16 02:11:40.304428 update_engine[1565]: I20251216 02:11:40.303501 1565 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Dec 16 02:11:40.304428 update_engine[1565]: I20251216 02:11:40.303506 1565 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Dec 16 02:11:40.304428 update_engine[1565]: I20251216 02:11:40.303580 1565 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 16 02:11:40.304428 update_engine[1565]: I20251216 02:11:40.303602 1565 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 16 02:11:40.304428 update_engine[1565]: I20251216 02:11:40.303608 1565 omaha_request_action.cc:272] Request: Dec 16 02:11:40.304428 update_engine[1565]: Dec 16 02:11:40.304428 update_engine[1565]: Dec 16 02:11:40.304428 update_engine[1565]: Dec 16 02:11:40.304428 update_engine[1565]: Dec 16 02:11:40.304428 update_engine[1565]: Dec 16 02:11:40.304428 update_engine[1565]: Dec 16 02:11:40.304428 update_engine[1565]: I20251216 02:11:40.303615 1565 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 02:11:40.304428 update_engine[1565]: I20251216 02:11:40.303644 1565 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 02:11:40.304428 update_engine[1565]: I20251216 02:11:40.303982 1565 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 16 02:11:40.306167 locksmithd[1612]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Dec 16 02:11:40.307404 update_engine[1565]: E20251216 02:11:40.305728 1565 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 02:11:40.307404 update_engine[1565]: I20251216 02:11:40.305799 1565 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 16 02:11:40.307404 update_engine[1565]: I20251216 02:11:40.305807 1565 omaha_request_action.cc:617] Omaha request response: Dec 16 02:11:40.307404 update_engine[1565]: I20251216 02:11:40.305817 1565 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 16 02:11:40.307404 update_engine[1565]: I20251216 02:11:40.305821 1565 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 16 02:11:40.307404 update_engine[1565]: I20251216 02:11:40.305825 1565 update_attempter.cc:306] Processing Done. Dec 16 02:11:40.307404 update_engine[1565]: I20251216 02:11:40.305830 1565 update_attempter.cc:310] Error event sent. 
Dec 16 02:11:40.307404 update_engine[1565]: I20251216 02:11:40.305840 1565 update_check_scheduler.cc:74] Next update check in 41m35s Dec 16 02:11:40.307778 locksmithd[1612]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Dec 16 02:11:41.454481 kubelet[2826]: E1216 02:11:41.454420 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:11:43.247000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-49.13.61.135:22-139.178.68.195:53822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:43.248410 systemd[1]: Started sshd@12-49.13.61.135:22-139.178.68.195:53822.service - OpenSSH per-connection server daemon (139.178.68.195:53822). Dec 16 02:11:43.249284 kernel: kauditd_printk_skb: 23 callbacks suppressed Dec 16 02:11:43.249340 kernel: audit: type=1130 audit(1765851103.247:776): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-49.13.61.135:22-139.178.68.195:53822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:11:43.453570 kubelet[2826]: E1216 02:11:43.453291 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:11:44.090000 audit[5211]: USER_ACCT pid=5211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.094207 sshd[5211]: Accepted publickey for core from 139.178.68.195 port 53822 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:11:44.095781 sshd-session[5211]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:11:44.093000 audit[5211]: CRED_ACQ pid=5211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.099286 kernel: audit: type=1101 audit(1765851104.090:777): pid=5211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.099380 kernel: audit: type=1103 audit(1765851104.093:778): pid=5211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.101448 kernel: audit: type=1006 audit(1765851104.093:779): pid=5211 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Dec 16 02:11:44.093000 audit[5211]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffec9cbe60 a2=3 a3=0 items=0 ppid=1 pid=5211 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:44.105771 kernel: audit: type=1300 audit(1765851104.093:779): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffec9cbe60 a2=3 a3=0 items=0 ppid=1 pid=5211 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:44.093000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:44.110166 kernel: audit: 
type=1327 audit(1765851104.093:779): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:44.113669 systemd-logind[1564]: New session 14 of user core. Dec 16 02:11:44.117771 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 16 02:11:44.121000 audit[5211]: USER_START pid=5211 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.126000 audit[5215]: CRED_ACQ pid=5215 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.130384 kernel: audit: type=1105 audit(1765851104.121:780): pid=5211 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.130694 kernel: audit: type=1103 audit(1765851104.126:781): pid=5215 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.450382 kubelet[2826]: E1216 02:11:44.450331 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:11:44.666023 sshd[5215]: Connection closed by 139.178.68.195 port 53822 Dec 16 02:11:44.668608 sshd-session[5211]: pam_unix(sshd:session): session closed for user core Dec 16 02:11:44.670000 audit[5211]: USER_END pid=5211 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.677920 systemd[1]: session-14.scope: Deactivated successfully. 
Dec 16 02:11:44.670000 audit[5211]: CRED_DISP pid=5211 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.681377 kernel: audit: type=1106 audit(1765851104.670:782): pid=5211 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.681466 kernel: audit: type=1104 audit(1765851104.670:783): pid=5211 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:44.681728 systemd[1]: sshd@12-49.13.61.135:22-139.178.68.195:53822.service: Deactivated successfully. Dec 16 02:11:44.681000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-49.13.61.135:22-139.178.68.195:53822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:44.686221 systemd-logind[1564]: Session 14 logged out. Waiting for processes to exit. Dec 16 02:11:44.689840 systemd-logind[1564]: Removed session 14. Dec 16 02:11:44.838353 systemd[1]: Started sshd@13-49.13.61.135:22-139.178.68.195:53828.service - OpenSSH per-connection server daemon (139.178.68.195:53828). Dec 16 02:11:44.838000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-49.13.61.135:22-139.178.68.195:53828 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:45.695000 audit[5227]: USER_ACCT pid=5227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:45.697831 sshd[5227]: Accepted publickey for core from 139.178.68.195 port 53828 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:11:45.698000 audit[5227]: CRED_ACQ pid=5227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:45.698000 audit[5227]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd50e1410 a2=3 a3=0 items=0 ppid=1 pid=5227 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:45.698000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:45.700557 sshd-session[5227]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:11:45.708133 systemd-logind[1564]: New session 15 of user core. Dec 16 02:11:45.714353 systemd[1]: Started session-15.scope - Session 15 of User core. 
Dec 16 02:11:45.718000 audit[5227]: USER_START pid=5227 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:45.722000 audit[5233]: CRED_ACQ pid=5233 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:46.429658 sshd[5233]: Connection closed by 139.178.68.195 port 53828 Dec 16 02:11:46.431145 sshd-session[5227]: pam_unix(sshd:session): session closed for user core Dec 16 02:11:46.437000 audit[5227]: USER_END pid=5227 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:46.437000 audit[5227]: CRED_DISP pid=5227 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:46.445837 systemd[1]: sshd@13-49.13.61.135:22-139.178.68.195:53828.service: Deactivated successfully. Dec 16 02:11:46.445000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-49.13.61.135:22-139.178.68.195:53828 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:46.451136 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 02:11:46.453254 systemd-logind[1564]: Session 15 logged out. Waiting for processes to exit. Dec 16 02:11:46.458793 systemd-logind[1564]: Removed session 15. Dec 16 02:11:46.603000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-49.13.61.135:22-139.178.68.195:53832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:46.604313 systemd[1]: Started sshd@14-49.13.61.135:22-139.178.68.195:53832.service - OpenSSH per-connection server daemon (139.178.68.195:53832). 
Dec 16 02:11:47.453000 audit[5243]: USER_ACCT pid=5243 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:47.455276 sshd[5243]: Accepted publickey for core from 139.178.68.195 port 53832 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:11:47.455000 audit[5243]: CRED_ACQ pid=5243 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:47.455000 audit[5243]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdca61e80 a2=3 a3=0 items=0 ppid=1 pid=5243 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:47.455000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:47.457829 sshd-session[5243]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:11:47.467822 systemd-logind[1564]: New session 16 of user core. Dec 16 02:11:47.471276 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 16 02:11:47.474000 audit[5243]: USER_START pid=5243 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:47.477000 audit[5247]: CRED_ACQ pid=5247 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:48.668859 kernel: kauditd_printk_skb: 20 callbacks suppressed Dec 16 02:11:48.669079 kernel: audit: type=1325 audit(1765851108.665:800): table=filter:138 family=2 entries=26 op=nft_register_rule pid=5260 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:11:48.665000 audit[5260]: NETFILTER_CFG table=filter:138 family=2 entries=26 op=nft_register_rule pid=5260 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:11:48.665000 audit[5260]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffffbfa5cf0 a2=0 a3=1 items=0 ppid=2986 pid=5260 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:48.672221 kernel: audit: type=1300 audit(1765851108.665:800): arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffffbfa5cf0 a2=0 a3=1 items=0 ppid=2986 pid=5260 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:48.665000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:11:48.674537 kernel: audit: type=1327 audit(1765851108.665:800): 
proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:11:48.674000 audit[5260]: NETFILTER_CFG table=nat:139 family=2 entries=20 op=nft_register_rule pid=5260 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:11:48.677236 kernel: audit: type=1325 audit(1765851108.674:801): table=nat:139 family=2 entries=20 op=nft_register_rule pid=5260 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:11:48.674000 audit[5260]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffffbfa5cf0 a2=0 a3=1 items=0 ppid=2986 pid=5260 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:48.681260 kernel: audit: type=1300 audit(1765851108.674:801): arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffffbfa5cf0 a2=0 a3=1 items=0 ppid=2986 pid=5260 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:48.674000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:11:48.682899 kernel: audit: type=1327 audit(1765851108.674:801): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:11:48.815994 sshd[5247]: Connection closed by 139.178.68.195 port 53832 Dec 16 02:11:48.816354 sshd-session[5243]: pam_unix(sshd:session): session closed for user core Dec 16 02:11:48.817000 audit[5243]: USER_END pid=5243 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:48.825858 systemd-logind[1564]: Session 16 logged out. Waiting for processes to exit. Dec 16 02:11:48.826041 systemd[1]: sshd@14-49.13.61.135:22-139.178.68.195:53832.service: Deactivated successfully. Dec 16 02:11:48.829128 kernel: audit: type=1106 audit(1765851108.817:802): pid=5243 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:48.829208 kernel: audit: type=1104 audit(1765851108.817:803): pid=5243 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:48.817000 audit[5243]: CRED_DISP pid=5243 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:48.825000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-49.13.61.135:22-139.178.68.195:53832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:11:48.831368 kernel: audit: type=1131 audit(1765851108.825:804): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-49.13.61.135:22-139.178.68.195:53832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:48.832167 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 02:11:48.838639 systemd-logind[1564]: Removed session 16. Dec 16 02:11:48.981000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-49.13.61.135:22-139.178.68.195:53840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:48.982468 systemd[1]: Started sshd@15-49.13.61.135:22-139.178.68.195:53840.service - OpenSSH per-connection server daemon (139.178.68.195:53840). Dec 16 02:11:48.988068 kernel: audit: type=1130 audit(1765851108.981:805): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-49.13.61.135:22-139.178.68.195:53840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:49.718000 audit[5269]: NETFILTER_CFG table=filter:140 family=2 entries=38 op=nft_register_rule pid=5269 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:11:49.718000 audit[5269]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffdd0f7ff0 a2=0 a3=1 items=0 ppid=2986 pid=5269 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:49.718000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:11:49.723000 audit[5269]: NETFILTER_CFG table=nat:141 family=2 entries=20 op=nft_register_rule pid=5269 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:11:49.723000 audit[5269]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffdd0f7ff0 a2=0 a3=1 items=0 ppid=2986 pid=5269 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:49.723000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:11:49.829000 audit[5265]: USER_ACCT pid=5265 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:49.830617 sshd[5265]: Accepted publickey for core from 139.178.68.195 port 53840 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:11:49.830000 audit[5265]: CRED_ACQ pid=5265 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:49.830000 audit[5265]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdf114a90 a2=3 a3=0 items=0 ppid=1 pid=5265 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:49.830000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:49.833192 sshd-session[5265]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:11:49.842055 systemd-logind[1564]: New session 17 of user core. Dec 16 02:11:49.847360 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 16 02:11:49.856000 audit[5265]: USER_START pid=5265 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:49.858000 audit[5271]: CRED_ACQ pid=5271 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:50.543933 sshd[5271]: Connection closed by 139.178.68.195 port 53840 Dec 16 02:11:50.545481 sshd-session[5265]: pam_unix(sshd:session): session closed for user core Dec 16 02:11:50.546000 audit[5265]: USER_END pid=5265 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:50.546000 audit[5265]: CRED_DISP pid=5265 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:50.552623 systemd-logind[1564]: Session 17 logged out. Waiting for processes to exit. Dec 16 02:11:50.554725 systemd[1]: sshd@15-49.13.61.135:22-139.178.68.195:53840.service: Deactivated successfully. Dec 16 02:11:50.555000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-49.13.61.135:22-139.178.68.195:53840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:50.560746 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 02:11:50.564085 systemd-logind[1564]: Removed session 17. Dec 16 02:11:50.713000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-49.13.61.135:22-139.178.68.195:41586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:50.714388 systemd[1]: Started sshd@16-49.13.61.135:22-139.178.68.195:41586.service - OpenSSH per-connection server daemon (139.178.68.195:41586). 
Dec 16 02:11:51.452504 kubelet[2826]: E1216 02:11:51.451866 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:11:51.455190 kubelet[2826]: E1216 02:11:51.455087 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:11:51.456712 kubelet[2826]: E1216 02:11:51.456636 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:11:51.561000 audit[5281]: USER_ACCT pid=5281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:51.563193 sshd[5281]: Accepted publickey for core from 139.178.68.195 port 41586 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:11:51.563000 audit[5281]: CRED_ACQ pid=5281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:51.563000 audit[5281]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc6644c70 a2=3 a3=0 items=0 ppid=1 pid=5281 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:51.563000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:51.566673 sshd-session[5281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:11:51.575588 systemd-logind[1564]: New session 18 of user core. Dec 16 02:11:51.587112 systemd[1]: Started session-18.scope - Session 18 of User core. 
Dec 16 02:11:51.590000 audit[5281]: USER_START pid=5281 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:51.594000 audit[5285]: CRED_ACQ pid=5285 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:52.132643 sshd[5285]: Connection closed by 139.178.68.195 port 41586 Dec 16 02:11:52.133279 sshd-session[5281]: pam_unix(sshd:session): session closed for user core Dec 16 02:11:52.134000 audit[5281]: USER_END pid=5281 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:52.134000 audit[5281]: CRED_DISP pid=5281 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:52.141288 systemd[1]: sshd@16-49.13.61.135:22-139.178.68.195:41586.service: Deactivated successfully. Dec 16 02:11:52.142000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-49.13.61.135:22-139.178.68.195:41586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:52.148248 systemd[1]: session-18.scope: Deactivated successfully. Dec 16 02:11:52.152254 systemd-logind[1564]: Session 18 logged out. Waiting for processes to exit. Dec 16 02:11:52.155932 systemd-logind[1564]: Removed session 18. 
Dec 16 02:11:53.580000 audit[5299]: NETFILTER_CFG table=filter:142 family=2 entries=26 op=nft_register_rule pid=5299 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:11:53.580000 audit[5299]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffcb49c680 a2=0 a3=1 items=0 ppid=2986 pid=5299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:53.580000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:11:53.585000 audit[5299]: NETFILTER_CFG table=nat:143 family=2 entries=104 op=nft_register_chain pid=5299 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 02:11:53.585000 audit[5299]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffcb49c680 a2=0 a3=1 items=0 ppid=2986 pid=5299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:53.585000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 02:11:56.451937 kubelet[2826]: E1216 02:11:56.451798 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:11:56.454475 kubelet[2826]: E1216 02:11:56.452183 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:11:57.301000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-49.13.61.135:22-139.178.68.195:41588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:11:57.303468 kernel: kauditd_printk_skb: 33 callbacks suppressed Dec 16 02:11:57.303506 kernel: audit: type=1130 audit(1765851117.301:827): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-49.13.61.135:22-139.178.68.195:41588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:57.302249 systemd[1]: Started sshd@17-49.13.61.135:22-139.178.68.195:41588.service - OpenSSH per-connection server daemon (139.178.68.195:41588). Dec 16 02:11:58.137000 audit[5301]: USER_ACCT pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.141118 kernel: audit: type=1101 audit(1765851118.137:828): pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.141236 sshd[5301]: Accepted publickey for core from 139.178.68.195 port 41588 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:11:58.142000 audit[5301]: CRED_ACQ pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.144340 sshd-session[5301]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:11:58.148298 kernel: audit: type=1103 audit(1765851118.142:829): pid=5301 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.148409 kernel: audit: type=1006 audit(1765851118.142:830): pid=5301 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Dec 16 02:11:58.142000 audit[5301]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdf54c140 a2=3 a3=0 items=0 ppid=1 pid=5301 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:58.151012 kernel: audit: type=1300 audit(1765851118.142:830): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdf54c140 a2=3 a3=0 items=0 ppid=1 pid=5301 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:11:58.151534 kernel: audit: type=1327 audit(1765851118.142:830): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:58.142000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:11:58.159101 systemd-logind[1564]: New session 19 of user core. Dec 16 02:11:58.169633 systemd[1]: Started session-19.scope - Session 19 of User core. 
Dec 16 02:11:58.173000 audit[5301]: USER_START pid=5301 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.177000 audit[5305]: CRED_ACQ pid=5305 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.181871 kernel: audit: type=1105 audit(1765851118.173:831): pid=5301 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.181976 kernel: audit: type=1103 audit(1765851118.177:832): pid=5305 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.726805 sshd[5305]: Connection closed by 139.178.68.195 port 41588 Dec 16 02:11:58.727447 sshd-session[5301]: pam_unix(sshd:session): session closed for user core Dec 16 02:11:58.732000 audit[5301]: USER_END pid=5301 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.732000 audit[5301]: CRED_DISP pid=5301 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.738660 kernel: audit: type=1106 audit(1765851118.732:833): pid=5301 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.738742 kernel: audit: type=1104 audit(1765851118.732:834): pid=5301 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:11:58.739354 systemd[1]: sshd@17-49.13.61.135:22-139.178.68.195:41588.service: Deactivated successfully. Dec 16 02:11:58.738000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-49.13.61.135:22-139.178.68.195:41588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:11:58.743889 systemd[1]: session-19.scope: Deactivated successfully. Dec 16 02:11:58.747197 systemd-logind[1564]: Session 19 logged out. Waiting for processes to exit. 
Dec 16 02:11:58.751481 systemd-logind[1564]: Removed session 19. Dec 16 02:11:59.454669 kubelet[2826]: E1216 02:11:59.454227 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:12:02.449782 kubelet[2826]: E1216 02:12:02.449672 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:12:03.896420 systemd[1]: Started sshd@18-49.13.61.135:22-139.178.68.195:41908.service - OpenSSH per-connection server daemon (139.178.68.195:41908). Dec 16 02:12:03.895000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-49.13.61.135:22-139.178.68.195:41908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:12:03.899251 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 02:12:03.899584 kernel: audit: type=1130 audit(1765851123.895:836): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-49.13.61.135:22-139.178.68.195:41908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 02:12:04.746000 audit[5344]: USER_ACCT pid=5344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:04.748343 sshd[5344]: Accepted publickey for core from 139.178.68.195 port 41908 ssh2: RSA SHA256:29Xpio+ELo8MGyKRWyN97HlQjkl70JsN7vQ26ExbU7g Dec 16 02:12:04.751107 kernel: audit: type=1101 audit(1765851124.746:837): pid=5344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:04.750000 audit[5344]: CRED_ACQ pid=5344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:04.752760 sshd-session[5344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 02:12:04.758016 kernel: audit: type=1103 audit(1765851124.750:838): pid=5344 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:04.758167 kernel: audit: type=1006 audit(1765851124.750:839): pid=5344 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Dec 16 02:12:04.750000 audit[5344]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff05c1830 a2=3 a3=0 items=0 ppid=1 pid=5344 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:04.760732 kernel: audit: type=1300 audit(1765851124.750:839): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff05c1830 a2=3 a3=0 items=0 ppid=1 pid=5344 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:04.750000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:12:04.762116 kernel: audit: type=1327 audit(1765851124.750:839): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 02:12:04.766267 systemd-logind[1564]: New session 20 of user core. Dec 16 02:12:04.771425 systemd[1]: Started session-20.scope - Session 20 of User core. 
Dec 16 02:12:04.774000 audit[5344]: USER_START pid=5344 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:04.778000 audit[5348]: CRED_ACQ pid=5348 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:04.783210 kernel: audit: type=1105 audit(1765851124.774:840): pid=5344 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:04.783341 kernel: audit: type=1103 audit(1765851124.778:841): pid=5348 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:05.320235 sshd[5348]: Connection closed by 139.178.68.195 port 41908 Dec 16 02:12:05.321198 sshd-session[5344]: pam_unix(sshd:session): session closed for user core Dec 16 02:12:05.322000 audit[5344]: USER_END pid=5344 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:05.324000 audit[5344]: CRED_DISP pid=5344 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:05.331619 kernel: audit: type=1106 audit(1765851125.322:842): pid=5344 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:05.331725 kernel: audit: type=1104 audit(1765851125.324:843): pid=5344 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 02:12:05.332637 systemd-logind[1564]: Session 20 logged out. Waiting for processes to exit. Dec 16 02:12:05.333517 systemd[1]: sshd@18-49.13.61.135:22-139.178.68.195:41908.service: Deactivated successfully. Dec 16 02:12:05.334000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-49.13.61.135:22-139.178.68.195:41908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 02:12:05.340123 systemd[1]: session-20.scope: Deactivated successfully. 
Dec 16 02:12:05.342925 systemd-logind[1564]: Removed session 20. Dec 16 02:12:05.453052 kubelet[2826]: E1216 02:12:05.452957 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:12:06.449737 kubelet[2826]: E1216 02:12:06.449647 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:12:09.452277 kubelet[2826]: E1216 02:12:09.452217 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:12:11.450369 kubelet[2826]: E1216 02:12:11.450052 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:12:13.449991 kubelet[2826]: E1216 02:12:13.449878 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:12:15.457223 kubelet[2826]: E1216 02:12:15.456843 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-xtmz8" podUID="24ee6e1b-64f9-47f5-86c3-17009d2e74c9" Dec 16 02:12:17.449536 kubelet[2826]: E1216 02:12:17.449432 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-75666888-t2jlw" podUID="a4f3ee57-42ce-4008-b96e-85199f6fd632" Dec 16 02:12:19.451519 kubelet[2826]: E1216 02:12:19.451393 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86cf67c95b-68pm7" podUID="4f109e8a-b6cd-4daa-a636-a987203ce9dc" Dec 16 02:12:19.844709 kubelet[2826]: E1216 02:12:19.844300 2826 controller.go:195] "Failed to update lease" err="Put \"https://49.13.61.135:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-9-be0981937a?timeout=10s\": context deadline exceeded" Dec 16 02:12:20.098702 kubelet[2826]: E1216 02:12:20.098571 2826 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:51508->10.0.0.2:2379: read: connection timed out" Dec 16 02:12:20.116986 systemd[1]: cri-containerd-925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605.scope: Deactivated successfully. Dec 16 02:12:20.117477 systemd[1]: cri-containerd-925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605.scope: Consumed 38.663s CPU time, 102.2M memory peak. 
Dec 16 02:12:20.124465 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 02:12:20.124601 kernel: audit: type=1334 audit(1765851140.122:845): prog-id=146 op=UNLOAD Dec 16 02:12:20.125614 kernel: audit: type=1334 audit(1765851140.122:846): prog-id=150 op=UNLOAD Dec 16 02:12:20.122000 audit: BPF prog-id=146 op=UNLOAD Dec 16 02:12:20.122000 audit: BPF prog-id=150 op=UNLOAD Dec 16 02:12:20.125820 containerd[1587]: time="2025-12-16T02:12:20.125744109Z" level=info msg="received container exit event container_id:\"925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605\" id:\"925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605\" pid:3156 exit_status:1 exited_at:{seconds:1765851140 nanos:120986736}" Dec 16 02:12:20.154740 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605-rootfs.mount: Deactivated successfully. Dec 16 02:12:20.265703 kubelet[2826]: I1216 02:12:20.265670 2826 scope.go:117] "RemoveContainer" containerID="925e589e3022db838d2a06e8e8afa1bb13cae2beedc73d274cdf9a917c7dd605" Dec 16 02:12:20.268779 containerd[1587]: time="2025-12-16T02:12:20.268734518Z" level=info msg="CreateContainer within sandbox \"308c511d8fce666b8915ae8dfb8b3cd7fb1cf254d057585b85c4bd6687cdc4ea\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 16 02:12:20.283127 containerd[1587]: time="2025-12-16T02:12:20.282930559Z" level=info msg="Container f45293e9a42eb48f1bcaca1f0225081fb70385e7c1bd9dff5b9e073fbda6170c: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:12:20.286848 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount383596897.mount: Deactivated successfully. Dec 16 02:12:20.294574 containerd[1587]: time="2025-12-16T02:12:20.294366351Z" level=info msg="CreateContainer within sandbox \"308c511d8fce666b8915ae8dfb8b3cd7fb1cf254d057585b85c4bd6687cdc4ea\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"f45293e9a42eb48f1bcaca1f0225081fb70385e7c1bd9dff5b9e073fbda6170c\"" Dec 16 02:12:20.295544 containerd[1587]: time="2025-12-16T02:12:20.295395474Z" level=info msg="StartContainer for \"f45293e9a42eb48f1bcaca1f0225081fb70385e7c1bd9dff5b9e073fbda6170c\"" Dec 16 02:12:20.299649 containerd[1587]: time="2025-12-16T02:12:20.299602646Z" level=info msg="connecting to shim f45293e9a42eb48f1bcaca1f0225081fb70385e7c1bd9dff5b9e073fbda6170c" address="unix:///run/containerd/s/eea70695d8e1f8d312e6eac1961b1d75231882d918380480220855a700a81cde" protocol=ttrpc version=3 Dec 16 02:12:20.339442 systemd[1]: Started cri-containerd-f45293e9a42eb48f1bcaca1f0225081fb70385e7c1bd9dff5b9e073fbda6170c.scope - libcontainer container f45293e9a42eb48f1bcaca1f0225081fb70385e7c1bd9dff5b9e073fbda6170c. 
Dec 16 02:12:20.357000 audit: BPF prog-id=256 op=LOAD Dec 16 02:12:20.361191 kernel: audit: type=1334 audit(1765851140.357:847): prog-id=256 op=LOAD Dec 16 02:12:20.360000 audit: BPF prog-id=257 op=LOAD Dec 16 02:12:20.362113 kernel: audit: type=1334 audit(1765851140.360:848): prog-id=257 op=LOAD Dec 16 02:12:20.360000 audit[5379]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2889 pid=5379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:20.360000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353239336539613432656234386631626361636131663032323530 Dec 16 02:12:20.367799 kernel: audit: type=1300 audit(1765851140.360:848): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2889 pid=5379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:20.367872 kernel: audit: type=1327 audit(1765851140.360:848): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353239336539613432656234386631626361636131663032323530 Dec 16 02:12:20.367926 kernel: audit: type=1334 audit(1765851140.361:849): prog-id=257 op=UNLOAD Dec 16 02:12:20.361000 audit: BPF prog-id=257 op=UNLOAD Dec 16 02:12:20.361000 audit[5379]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2889 pid=5379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:20.371020 kernel: audit: type=1300 audit(1765851140.361:849): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2889 pid=5379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:20.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353239336539613432656234386631626361636131663032323530 Dec 16 02:12:20.374500 kernel: audit: type=1327 audit(1765851140.361:849): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353239336539613432656234386631626361636131663032323530 Dec 16 02:12:20.363000 audit: BPF prog-id=258 op=LOAD Dec 16 02:12:20.363000 audit[5379]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2889 pid=5379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:20.363000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353239336539613432656234386631626361636131663032323530 Dec 16 02:12:20.363000 audit: BPF prog-id=259 op=LOAD Dec 16 02:12:20.363000 audit[5379]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2889 pid=5379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:20.363000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353239336539613432656234386631626361636131663032323530 Dec 16 02:12:20.363000 audit: BPF prog-id=259 op=UNLOAD Dec 16 02:12:20.363000 audit[5379]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2889 pid=5379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:20.376137 kernel: audit: type=1334 audit(1765851140.363:850): prog-id=258 op=LOAD Dec 16 02:12:20.363000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353239336539613432656234386631626361636131663032323530 Dec 16 02:12:20.363000 audit: BPF prog-id=258 op=UNLOAD Dec 16 02:12:20.363000 audit[5379]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2889 pid=5379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:20.363000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353239336539613432656234386631626361636131663032323530 Dec 16 02:12:20.363000 audit: BPF prog-id=260 op=LOAD Dec 16 02:12:20.363000 audit[5379]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2889 pid=5379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:20.363000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634353239336539613432656234386631626361636131663032323530 Dec 16 02:12:20.399219 containerd[1587]: time="2025-12-16T02:12:20.399111970Z" level=info msg="StartContainer for \"f45293e9a42eb48f1bcaca1f0225081fb70385e7c1bd9dff5b9e073fbda6170c\" returns successfully" Dec 16 02:12:20.509172 systemd[1]: cri-containerd-ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361.scope: Deactivated successfully. 
Dec 16 02:12:20.513280 systemd[1]: cri-containerd-ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361.scope: Consumed 5.940s CPU time, 64.1M memory peak, 1.7M read from disk. Dec 16 02:12:20.514000 audit: BPF prog-id=261 op=LOAD Dec 16 02:12:20.514000 audit: BPF prog-id=83 op=UNLOAD Dec 16 02:12:20.515000 audit: BPF prog-id=102 op=UNLOAD Dec 16 02:12:20.515000 audit: BPF prog-id=107 op=UNLOAD Dec 16 02:12:20.516923 containerd[1587]: time="2025-12-16T02:12:20.516854227Z" level=info msg="received container exit event container_id:\"ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361\" id:\"ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361\" pid:2673 exit_status:1 exited_at:{seconds:1765851140 nanos:516369105}" Dec 16 02:12:20.543940 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361-rootfs.mount: Deactivated successfully. Dec 16 02:12:21.272424 kubelet[2826]: I1216 02:12:21.272164 2826 scope.go:117] "RemoveContainer" containerID="ab3a6c1fdc84dc2a895b2e11c4011e2164a99dbe66b2af2063ec6ed6b34ae361" Dec 16 02:12:21.276105 containerd[1587]: time="2025-12-16T02:12:21.276060461Z" level=info msg="CreateContainer within sandbox \"8c83ba9049ba260e396bf8e549021049793142013d799e542077147dc6e8f64d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Dec 16 02:12:21.288407 containerd[1587]: time="2025-12-16T02:12:21.287941345Z" level=info msg="Container 9a94174211a75d19e36d98907c1715caf72d0d9bddb5a84c0103b4f8a267cd71: CDI devices from CRI Config.CDIDevices: []" Dec 16 02:12:21.337451 containerd[1587]: time="2025-12-16T02:12:21.337384007Z" level=info msg="CreateContainer within sandbox \"8c83ba9049ba260e396bf8e549021049793142013d799e542077147dc6e8f64d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"9a94174211a75d19e36d98907c1715caf72d0d9bddb5a84c0103b4f8a267cd71\"" Dec 16 02:12:21.338048 containerd[1587]: time="2025-12-16T02:12:21.337996729Z" level=info msg="StartContainer for \"9a94174211a75d19e36d98907c1715caf72d0d9bddb5a84c0103b4f8a267cd71\"" Dec 16 02:12:21.339695 containerd[1587]: time="2025-12-16T02:12:21.339593855Z" level=info msg="connecting to shim 9a94174211a75d19e36d98907c1715caf72d0d9bddb5a84c0103b4f8a267cd71" address="unix:///run/containerd/s/cddd11251b03c2bd2e00c0064853b37dba3f2d59e2930e4aa6214688538ce9ce" protocol=ttrpc version=3 Dec 16 02:12:21.365272 systemd[1]: Started cri-containerd-9a94174211a75d19e36d98907c1715caf72d0d9bddb5a84c0103b4f8a267cd71.scope - libcontainer container 9a94174211a75d19e36d98907c1715caf72d0d9bddb5a84c0103b4f8a267cd71. 
Dec 16 02:12:21.379000 audit: BPF prog-id=262 op=LOAD Dec 16 02:12:21.380000 audit: BPF prog-id=263 op=LOAD Dec 16 02:12:21.380000 audit[5425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=2496 pid=5425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:21.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961393431373432313161373564313965333664393839303763313731 Dec 16 02:12:21.380000 audit: BPF prog-id=263 op=UNLOAD Dec 16 02:12:21.380000 audit[5425]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2496 pid=5425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:21.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961393431373432313161373564313965333664393839303763313731 Dec 16 02:12:21.380000 audit: BPF prog-id=264 op=LOAD Dec 16 02:12:21.380000 audit[5425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=2496 pid=5425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:21.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961393431373432313161373564313965333664393839303763313731 Dec 16 02:12:21.380000 audit: BPF prog-id=265 op=LOAD Dec 16 02:12:21.380000 audit[5425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=2496 pid=5425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:21.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961393431373432313161373564313965333664393839303763313731 Dec 16 02:12:21.380000 audit: BPF prog-id=265 op=UNLOAD Dec 16 02:12:21.380000 audit[5425]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2496 pid=5425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:21.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961393431373432313161373564313965333664393839303763313731 Dec 16 02:12:21.380000 audit: BPF prog-id=264 op=UNLOAD Dec 16 02:12:21.380000 audit[5425]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2496 pid=5425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:21.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961393431373432313161373564313965333664393839303763313731 Dec 16 02:12:21.380000 audit: BPF prog-id=266 op=LOAD Dec 16 02:12:21.380000 audit[5425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=2496 pid=5425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 02:12:21.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961393431373432313161373564313965333664393839303763313731 Dec 16 02:12:21.413251 containerd[1587]: time="2025-12-16T02:12:21.413200886Z" level=info msg="StartContainer for \"9a94174211a75d19e36d98907c1715caf72d0d9bddb5a84c0103b4f8a267cd71\" returns successfully" Dec 16 02:12:21.451131 containerd[1587]: time="2025-12-16T02:12:21.451005025Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 02:12:21.821153 containerd[1587]: time="2025-12-16T02:12:21.820868265Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:12:21.822438 containerd[1587]: time="2025-12-16T02:12:21.822268870Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 02:12:21.822438 containerd[1587]: time="2025-12-16T02:12:21.822382350Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 02:12:21.822884 kubelet[2826]: E1216 02:12:21.822815 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 02:12:21.823224 kubelet[2826]: E1216 02:12:21.822865 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 02:12:21.823522 kubelet[2826]: E1216 02:12:21.823456 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-b648c4bbd-x7xsf_calico-system(76743683-d50e-4bc9-aceb-a84e73d5c7be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 02:12:21.824842 kubelet[2826]: E1216 02:12:21.824776 2826 pod_workers.go:1324] 
"Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-b648c4bbd-x7xsf" podUID="76743683-d50e-4bc9-aceb-a84e73d5c7be" Dec 16 02:12:22.450215 containerd[1587]: time="2025-12-16T02:12:22.450054742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 02:12:22.815171 containerd[1587]: time="2025-12-16T02:12:22.814951898Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:12:22.817338 containerd[1587]: time="2025-12-16T02:12:22.817153868Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 02:12:22.817338 containerd[1587]: time="2025-12-16T02:12:22.817284429Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 02:12:22.817762 kubelet[2826]: E1216 02:12:22.817689 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 02:12:22.818486 kubelet[2826]: E1216 02:12:22.818158 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 02:12:22.818486 kubelet[2826]: E1216 02:12:22.818337 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 02:12:22.819682 containerd[1587]: time="2025-12-16T02:12:22.819604399Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 02:12:23.147511 containerd[1587]: time="2025-12-16T02:12:23.146951944Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:12:23.149405 containerd[1587]: time="2025-12-16T02:12:23.148967474Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 02:12:23.150225 kubelet[2826]: E1216 02:12:23.149924 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 02:12:23.150225 kubelet[2826]: E1216 02:12:23.149988 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 02:12:23.150225 kubelet[2826]: E1216 02:12:23.150079 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-rp5zk_calico-system(919dd2b2-2bc2-4394-9fed-3f9f47f938e5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 02:12:23.150225 kubelet[2826]: E1216 02:12:23.150124 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rp5zk" podUID="919dd2b2-2bc2-4394-9fed-3f9f47f938e5" Dec 16 02:12:23.150467 containerd[1587]: time="2025-12-16T02:12:23.149113155Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 02:12:24.456366 containerd[1587]: time="2025-12-16T02:12:24.456313734Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 02:12:24.796320 containerd[1587]: time="2025-12-16T02:12:24.795830593Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 02:12:24.797422 containerd[1587]: time="2025-12-16T02:12:24.797335802Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 02:12:24.797551 containerd[1587]: time="2025-12-16T02:12:24.797449843Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 02:12:24.797882 kubelet[2826]: E1216 02:12:24.797772 2826 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 02:12:24.797882 kubelet[2826]: E1216 02:12:24.797845 2826 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 02:12:24.798453 kubelet[2826]: E1216 02:12:24.797947 2826 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-cl8m7_calico-system(e4bfcf46-398e-437f-b7f1-81589479eeb2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 02:12:24.798453 kubelet[2826]: E1216 02:12:24.798055 2826 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-cl8m7" podUID="e4bfcf46-398e-437f-b7f1-81589479eeb2" Dec 16 02:12:25.654675 systemd[1]: cri-containerd-0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96.scope: Deactivated successfully. Dec 16 02:12:25.655077 systemd[1]: cri-containerd-0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96.scope: Consumed 4.456s CPU time, 26M memory peak, 3.4M read from disk. Dec 16 02:12:25.657414 kernel: kauditd_printk_skb: 40 callbacks suppressed Dec 16 02:12:25.657527 kernel: audit: type=1334 audit(1765851145.654:867): prog-id=267 op=LOAD Dec 16 02:12:25.654000 audit: BPF prog-id=267 op=LOAD Dec 16 02:12:25.654000 audit: BPF prog-id=93 op=UNLOAD Dec 16 02:12:25.658380 kernel: audit: type=1334 audit(1765851145.654:868): prog-id=93 op=UNLOAD Dec 16 02:12:25.661871 kernel: audit: type=1334 audit(1765851145.658:869): prog-id=108 op=UNLOAD Dec 16 02:12:25.662121 kernel: audit: type=1334 audit(1765851145.658:870): prog-id=112 op=UNLOAD Dec 16 02:12:25.658000 audit: BPF prog-id=108 op=UNLOAD Dec 16 02:12:25.658000 audit: BPF prog-id=112 op=UNLOAD Dec 16 02:12:25.662518 containerd[1587]: time="2025-12-16T02:12:25.661483911Z" level=info msg="received container exit event container_id:\"0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96\" id:\"0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96\" pid:2681 exit_status:1 exited_at:{seconds:1765851145 nanos:660742106}" Dec 16 02:12:25.690411 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0fc1e0e40890f3b5ada9bc6499e87cce560de40f7efd5785ca4b5528a6808b96-rootfs.mount: Deactivated successfully.