Dec 16 12:53:11.161707 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Fri Dec 12 15:17:57 -00 2025
Dec 16 12:53:11.161732 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=4dd8de2ff094d97322e7371b16ddee5fc8348868bcdd9ec7bcd11ea9d3933fee
Dec 16 12:53:11.161743 kernel: BIOS-provided physical RAM map:
Dec 16 12:53:11.161749 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 16 12:53:11.161754 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 16 12:53:11.161759 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 16 12:53:11.161765 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Dec 16 12:53:11.161771 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Dec 16 12:53:11.161778 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Dec 16 12:53:11.161783 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Dec 16 12:53:11.161788 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 16 12:53:11.161793 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 16 12:53:11.161798 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Dec 16 12:53:11.161804 kernel: NX (Execute Disable) protection: active
Dec 16 12:53:11.161811 kernel: APIC: Static calls initialized
Dec 16 12:53:11.161817 kernel: SMBIOS 3.0.0 present.
Dec 16 12:53:11.161822 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Dec 16 12:53:11.161828 kernel: DMI: Memory slots populated: 1/1
Dec 16 12:53:11.161833 kernel: Hypervisor detected: KVM
Dec 16 12:53:11.161839 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Dec 16 12:53:11.161844 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 16 12:53:11.161850 kernel: kvm-clock: using sched offset of 3959575048 cycles
Dec 16 12:53:11.161856 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 16 12:53:11.161863 kernel: tsc: Detected 2445.404 MHz processor
Dec 16 12:53:11.161869 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 16 12:53:11.161876 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 16 12:53:11.161882 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Dec 16 12:53:11.161888 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 16 12:53:11.161894 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Dec 16 12:53:11.161900 kernel: Using GB pages for direct mapping
Dec 16 12:53:11.161906 kernel: ACPI: Early table checksum verification disabled
Dec 16 12:53:11.161914 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Dec 16 12:53:11.161920 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:53:11.161926 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:53:11.161932 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:53:11.161938 kernel: ACPI: FACS 0x000000007CFE0000 000040
Dec 16 12:53:11.161944 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:53:11.161950 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:53:11.161957 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:53:11.161963 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:53:11.161971 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Dec 16 12:53:11.161977 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Dec 16 12:53:11.161984 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Dec 16 12:53:11.161991 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Dec 16 12:53:11.161997 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Dec 16 12:53:11.162003 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Dec 16 12:53:11.162009 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Dec 16 12:53:11.162016 kernel: No NUMA configuration found
Dec 16 12:53:11.162022 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Dec 16 12:53:11.162029 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
Dec 16 12:53:11.162035 kernel: Zone ranges:
Dec 16 12:53:11.162041 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Dec 16 12:53:11.162048 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Dec 16 12:53:11.162054 kernel: Normal empty
Dec 16 12:53:11.162060 kernel: Device empty
Dec 16 12:53:11.162066 kernel: Movable zone start for each node
Dec 16 12:53:11.162072 kernel: Early memory node ranges
Dec 16 12:53:11.162079 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Dec 16 12:53:11.162085 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Dec 16 12:53:11.162092 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Dec 16 12:53:11.162097 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 16 12:53:11.162103 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 16 12:53:11.162110 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Dec 16 12:53:11.162116 kernel: ACPI: PM-Timer IO Port: 0x608
Dec 16 12:53:11.162123 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 16 12:53:11.162129 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 16 12:53:11.162136 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 16 12:53:11.162142 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 16 12:53:11.162148 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 16 12:53:11.162154 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 16 12:53:11.162160 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 16 12:53:11.162167 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 16 12:53:11.162174 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Dec 16 12:53:11.162180 kernel: CPU topo: Max. logical packages: 1
Dec 16 12:53:11.162186 kernel: CPU topo: Max. logical dies: 1
Dec 16 12:53:11.162192 kernel: CPU topo: Max. dies per package: 1
Dec 16 12:53:11.162198 kernel: CPU topo: Max. threads per core: 1
Dec 16 12:53:11.162204 kernel: CPU topo: Num. cores per package: 2
Dec 16 12:53:11.162210 kernel: CPU topo: Num. threads per package: 2
Dec 16 12:53:11.162217 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Dec 16 12:53:11.162223 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 16 12:53:11.162230 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Dec 16 12:53:11.162236 kernel: Booting paravirtualized kernel on KVM
Dec 16 12:53:11.162242 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 16 12:53:11.162249 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Dec 16 12:53:11.162255 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Dec 16 12:53:11.162261 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Dec 16 12:53:11.162268 kernel: pcpu-alloc: [0] 0 1
Dec 16 12:53:11.162274 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 16 12:53:11.162281 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=4dd8de2ff094d97322e7371b16ddee5fc8348868bcdd9ec7bcd11ea9d3933fee
Dec 16 12:53:11.162288 kernel: random: crng init done
Dec 16 12:53:11.162294 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 16 12:53:11.162301 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 16 12:53:11.162308 kernel: Fallback order for Node 0: 0
Dec 16 12:53:11.162314 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
Dec 16 12:53:11.162320 kernel: Policy zone: DMA32
Dec 16 12:53:11.162326 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 16 12:53:11.162333 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Dec 16 12:53:11.162339 kernel: ftrace: allocating 40103 entries in 157 pages
Dec 16 12:53:11.162345 kernel: ftrace: allocated 157 pages with 5 groups
Dec 16 12:53:11.162352 kernel: Dynamic Preempt: voluntary
Dec 16 12:53:11.162358 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 16 12:53:11.162365 kernel: rcu: RCU event tracing is enabled.
Dec 16 12:53:11.162372 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Dec 16 12:53:11.162378 kernel: Trampoline variant of Tasks RCU enabled.
Dec 16 12:53:11.162384 kernel: Rude variant of Tasks RCU enabled.
Dec 16 12:53:11.162390 kernel: Tracing variant of Tasks RCU enabled.
Dec 16 12:53:11.162396 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 16 12:53:11.162404 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Dec 16 12:53:11.162410 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 16 12:53:11.162416 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 16 12:53:11.162423 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 16 12:53:11.162429 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Dec 16 12:53:11.162435 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 16 12:53:11.162441 kernel: Console: colour VGA+ 80x25
Dec 16 12:53:11.162449 kernel: printk: legacy console [tty0] enabled
Dec 16 12:53:11.162455 kernel: printk: legacy console [ttyS0] enabled
Dec 16 12:53:11.162461 kernel: ACPI: Core revision 20240827
Dec 16 12:53:11.162471 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Dec 16 12:53:11.162479 kernel: APIC: Switch to symmetric I/O mode setup
Dec 16 12:53:11.162485 kernel: x2apic enabled
Dec 16 12:53:11.162492 kernel: APIC: Switched APIC routing to: physical x2apic
Dec 16 12:53:11.162498 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Dec 16 12:53:11.162505 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns
Dec 16 12:53:11.162512 kernel: Calibrating delay loop (skipped) preset value.. 4890.80 BogoMIPS (lpj=2445404)
Dec 16 12:53:11.162519 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 16 12:53:11.162526 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 16 12:53:11.162532 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 16 12:53:11.162540 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 16 12:53:11.162546 kernel: Spectre V2 : Mitigation: Retpolines
Dec 16 12:53:11.162553 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 16 12:53:11.162559 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 16 12:53:11.162566 kernel: active return thunk: retbleed_return_thunk
Dec 16 12:53:11.162572 kernel: RETBleed: Mitigation: untrained return thunk
Dec 16 12:53:11.162578 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 16 12:53:11.162586 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 16 12:53:11.162592 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 16 12:53:11.162599 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 16 12:53:11.162605 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 16 12:53:11.162612 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Dec 16 12:53:11.162618 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 16 12:53:11.162625 kernel: Freeing SMP alternatives memory: 32K
Dec 16 12:53:11.162632 kernel: pid_max: default: 32768 minimum: 301
Dec 16 12:53:11.162639 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 16 12:53:11.162658 kernel: landlock: Up and running.
Dec 16 12:53:11.162664 kernel: SELinux: Initializing.
Dec 16 12:53:11.162671 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Dec 16 12:53:11.162677 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Dec 16 12:53:11.162684 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 16 12:53:11.162691 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 16 12:53:11.162708 kernel: ... version: 0
Dec 16 12:53:11.162715 kernel: ... bit width: 48
Dec 16 12:53:11.162721 kernel: ... generic registers: 6
Dec 16 12:53:11.162727 kernel: ... value mask: 0000ffffffffffff
Dec 16 12:53:11.162734 kernel: ... max period: 00007fffffffffff
Dec 16 12:53:11.162740 kernel: ... fixed-purpose events: 0
Dec 16 12:53:11.162748 kernel: ... event mask: 000000000000003f
Dec 16 12:53:11.162755 kernel: signal: max sigframe size: 1776
Dec 16 12:53:11.162761 kernel: rcu: Hierarchical SRCU implementation.
Dec 16 12:53:11.162768 kernel: rcu: Max phase no-delay instances is 400.
Dec 16 12:53:11.162774 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 16 12:53:11.162781 kernel: smp: Bringing up secondary CPUs ...
Dec 16 12:53:11.162787 kernel: smpboot: x86: Booting SMP configuration:
Dec 16 12:53:11.162794 kernel: .... node #0, CPUs: #1
Dec 16 12:53:11.162801 kernel: smp: Brought up 1 node, 2 CPUs
Dec 16 12:53:11.162808 kernel: smpboot: Total of 2 processors activated (9781.61 BogoMIPS)
Dec 16 12:53:11.162814 kernel: Memory: 1936212K/2047464K available (14336K kernel code, 2444K rwdata, 29892K rodata, 15464K init, 2576K bss, 106708K reserved, 0K cma-reserved)
Dec 16 12:53:11.162821 kernel: devtmpfs: initialized
Dec 16 12:53:11.162827 kernel: x86/mm: Memory block size: 128MB
Dec 16 12:53:11.162834 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 16 12:53:11.162840 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Dec 16 12:53:11.162848 kernel: pinctrl core: initialized pinctrl subsystem
Dec 16 12:53:11.162854 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 16 12:53:11.162861 kernel: audit: initializing netlink subsys (disabled)
Dec 16 12:53:11.162867 kernel: audit: type=2000 audit(1765889587.856:1): state=initialized audit_enabled=0 res=1
Dec 16 12:53:11.162874 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 16 12:53:11.162880 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 16 12:53:11.162887 kernel: cpuidle: using governor menu
Dec 16 12:53:11.162894 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 16 12:53:11.162901 kernel: dca service started, version 1.12.1
Dec 16 12:53:11.162907 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Dec 16 12:53:11.162914 kernel: PCI: Using configuration type 1 for base access
Dec 16 12:53:11.162920 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 16 12:53:11.162927 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 16 12:53:11.162933 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 16 12:53:11.162941 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 16 12:53:11.162948 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 16 12:53:11.162954 kernel: ACPI: Added _OSI(Module Device)
Dec 16 12:53:11.162960 kernel: ACPI: Added _OSI(Processor Device)
Dec 16 12:53:11.162967 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 16 12:53:11.162973 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 16 12:53:11.162979 kernel: ACPI: Interpreter enabled
Dec 16 12:53:11.162987 kernel: ACPI: PM: (supports S0 S5)
Dec 16 12:53:11.162993 kernel: ACPI: Using IOAPIC for interrupt routing
Dec 16 12:53:11.163000 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 16 12:53:11.163006 kernel: PCI: Using E820 reservations for host bridge windows
Dec 16 12:53:11.163013 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Dec 16 12:53:11.163019 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 16 12:53:11.163158 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 16 12:53:11.163250 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Dec 16 12:53:11.163332 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Dec 16 12:53:11.163342 kernel: PCI host bridge to bus 0000:00
Dec 16 12:53:11.163423 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Dec 16 12:53:11.163496 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Dec 16 12:53:11.163570 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 16 12:53:11.163640 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Dec 16 12:53:11.163745 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 16 12:53:11.163817 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Dec 16 12:53:11.163886 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 16 12:53:11.163978 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Dec 16 12:53:11.164073 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 16 12:53:11.164155 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
Dec 16 12:53:11.164235 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
Dec 16 12:53:11.164318 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Dec 16 12:53:11.164397 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Dec 16 12:53:11.164480 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 16 12:53:11.164566 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:53:11.164657 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Dec 16 12:53:11.164757 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Dec 16 12:53:11.164838 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Dec 16 12:53:11.164916 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Dec 16 12:53:11.165053 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:53:11.165269 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Dec 16 12:53:11.165418 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Dec 16 12:53:11.165505 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Dec 16 12:53:11.165585 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 16 12:53:11.165690 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:53:11.166745 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Dec 16 12:53:11.166833 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Dec 16 12:53:11.166913 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Dec 16 12:53:11.166992 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 16 12:53:11.167078 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:53:11.167164 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Dec 16 12:53:11.167243 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Dec 16 12:53:11.167346 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Dec 16 12:53:11.167428 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 16 12:53:11.167604 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:53:11.167728 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Dec 16 12:53:11.167818 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Dec 16 12:53:11.167899 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Dec 16 12:53:11.167979 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 16 12:53:11.168064 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:53:11.168142 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Dec 16 12:53:11.168219 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Dec 16 12:53:11.168299 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Dec 16 12:53:11.168376 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 16 12:53:11.168458 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:53:11.168536 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Dec 16 12:53:11.168613 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Dec 16 12:53:11.168744 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Dec 16 12:53:11.168890 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Dec 16 12:53:11.169053 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:53:11.169139 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Dec 16 12:53:11.169219 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Dec 16 12:53:11.169301 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Dec 16 12:53:11.169382 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Dec 16 12:53:11.169468 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:53:11.169548 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Dec 16 12:53:11.169625 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Dec 16 12:53:11.169749 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Dec 16 12:53:11.169832 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 16 12:53:11.169919 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Dec 16 12:53:11.169998 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Dec 16 12:53:11.170081 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Dec 16 12:53:11.170159 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
Dec 16 12:53:11.170237 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
Dec 16 12:53:11.170319 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Dec 16 12:53:11.170401 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Dec 16 12:53:11.170488 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 16 12:53:11.170569 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
Dec 16 12:53:11.170662 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Dec 16 12:53:11.170764 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
Dec 16 12:53:11.170849 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Dec 16 12:53:11.171068 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 16 12:53:11.171158 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
Dec 16 12:53:11.171238 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Dec 16 12:53:11.171324 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Dec 16 12:53:11.171404 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
Dec 16 12:53:11.171489 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
Dec 16 12:53:11.173027 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Dec 16 12:53:11.173155 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 12:53:11.173250 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Dec 16 12:53:11.173333 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Dec 16 12:53:11.173426 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 12:53:11.173513 kernel: pci 0000:05:00.0: BAR 1 [mem 0xfe000000-0xfe000fff]
Dec 16 12:53:11.173606 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
Dec 16 12:53:11.173784 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Dec 16 12:53:11.173912 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Dec 16 12:53:11.173999 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
Dec 16 12:53:11.174604 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
Dec 16 12:53:11.174730 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Dec 16 12:53:11.174742 kernel: acpiphp: Slot [0] registered
Dec 16 12:53:11.175070 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 16 12:53:11.175157 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
Dec 16 12:53:11.175238 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
Dec 16 12:53:11.175323 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
Dec 16 12:53:11.175402 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Dec 16 12:53:11.175411 kernel: acpiphp: Slot [0-2] registered
Dec 16 12:53:11.175487 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Dec 16 12:53:11.175497 kernel: acpiphp: Slot [0-3] registered
Dec 16 12:53:11.175572 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Dec 16 12:53:11.175585 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 16 12:53:11.175591 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 16 12:53:11.175598 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 16 12:53:11.175605 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 16 12:53:11.175611 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Dec 16 12:53:11.175618 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Dec 16 12:53:11.175624 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Dec 16 12:53:11.175632 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Dec 16 12:53:11.175638 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Dec 16 12:53:11.175660 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Dec 16 12:53:11.175667 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Dec 16 12:53:11.175673 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Dec 16 12:53:11.175680 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Dec 16 12:53:11.175687 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Dec 16 12:53:11.175712 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Dec 16 12:53:11.175720 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Dec 16 12:53:11.175727 kernel: iommu: Default domain type: Translated
Dec 16 12:53:11.175733 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 16 12:53:11.175740 kernel: PCI: Using ACPI for IRQ routing
Dec 16 12:53:11.175746 kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 16 12:53:11.175753 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 16 12:53:11.175762 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Dec 16 12:53:11.175848 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Dec 16 12:53:11.175927 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Dec 16 12:53:11.176006 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 16 12:53:11.176015 kernel: vgaarb: loaded
Dec 16 12:53:11.176022 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Dec 16 12:53:11.176029 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Dec 16 12:53:11.176038 kernel: clocksource: Switched to clocksource kvm-clock
Dec 16 12:53:11.176044 kernel: VFS: Disk quotas dquot_6.6.0
Dec 16 12:53:11.176051 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 16 12:53:11.176058 kernel: pnp: PnP ACPI init
Dec 16 12:53:11.176143 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Dec 16 12:53:11.176154 kernel: pnp: PnP ACPI: found 5 devices
Dec 16 12:53:11.176161 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 16 12:53:11.176170 kernel: NET: Registered PF_INET protocol family
Dec 16 12:53:11.176176 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 12:53:11.176183 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Dec 16 12:53:11.176190 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 16 12:53:11.176196 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 16 12:53:11.176203 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Dec 16 12:53:11.176209 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Dec 16 12:53:11.176217 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Dec 16 12:53:11.176224 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Dec 16 12:53:11.176230 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 16 12:53:11.176237 kernel: NET: Registered PF_XDP protocol family
Dec 16 12:53:11.176315 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 16 12:53:11.176394 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 16 12:53:11.176471 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 16 12:53:11.176550 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Dec 16 12:53:11.176628 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Dec 16 12:53:11.177373 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Dec 16 12:53:11.177469 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Dec 16 12:53:11.177550 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Dec 16 12:53:11.177629 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Dec 16 12:53:11.177750 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Dec 16 12:53:11.177832 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Dec 16 12:53:11.177912 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 16 12:53:11.177991 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Dec 16 12:53:11.178069 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Dec 16 12:53:11.178147 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 16 12:53:11.178226 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Dec 16 12:53:11.178305 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Dec 16 12:53:11.178389 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 16 12:53:11.178468 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Dec 16 12:53:11.178547 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Dec 16 12:53:11.178625 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 16 12:53:11.178741 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Dec 16 12:53:11.178902 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Dec 16 12:53:11.179005 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 16 12:53:11.179085 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Dec 16 12:53:11.179167 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Dec 16 12:53:11.179247 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Dec 16 12:53:11.179394 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Dec 16 12:53:11.179481 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Dec 16 12:53:11.179559 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Dec 16 12:53:11.179636 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Dec 16 12:53:11.179747 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Dec 16 12:53:11.180319 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Dec 16 12:53:11.180405 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Dec 16 12:53:11.180483 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Dec 16 12:53:11.180560 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 16 12:53:11.180748 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Dec 16 12:53:11.180929 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Dec 16 12:53:11.181018 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 16 12:53:11.181093 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Dec 16 12:53:11.181169 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Dec 16 12:53:11.181238 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Dec 16 12:53:11.181320 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Dec 16 12:53:11.181394 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Dec 16 12:53:11.181471 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Dec 16 12:53:11.181545 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 16 12:53:11.181630 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Dec 16 12:53:11.181757 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 16 12:53:11.181837 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Dec 16 12:53:11.181910 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 16 12:53:11.181989 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Dec 16 12:53:11.182066 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 16 12:53:11.182141 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Dec 16 12:53:11.182213 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 16 12:53:11.182290 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Dec 16 12:53:11.182362 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Dec 16 12:53:11.182436 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Dec 16 12:53:11.182512 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Dec 16 12:53:11.182584 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Dec 16 12:53:11.182670 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Dec 16 12:53:11.182776 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Dec 16 12:53:11.182851 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Dec 16 12:53:11.182926 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 16 12:53:11.182937 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Dec 16 12:53:11.182944 kernel: PCI: CLS 0 bytes, default 64
Dec 16 12:53:11.182951 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns
Dec 16 12:53:11.182958 kernel: Initialise system trusted keyrings
Dec 16 12:53:11.182966 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Dec 16 12:53:11.182974 kernel: Key type asymmetric registered
Dec 16 12:53:11.182981 kernel: Asymmetric key parser 'x509' registered
Dec 16 12:53:11.182988 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Dec 16 12:53:11.182995 kernel: io scheduler mq-deadline registered
Dec 16 12:53:11.183002 kernel: io scheduler kyber registered
Dec 16 12:53:11.183009 kernel: io scheduler bfq registered
Dec 16 12:53:11.183087 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Dec 16 12:53:11.183165 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Dec 16 12:53:11.183246 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Dec 16 12:53:11.183326 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Dec 16 12:53:11.183404 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Dec 16 12:53:11.183481 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Dec 16 12:53:11.183557 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Dec 16 12:53:11.183633 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Dec 16 12:53:11.183747 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Dec 16 12:53:11.183826 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Dec 16 12:53:11.183903 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Dec 16 12:53:11.183979 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Dec 16 12:53:11.184055 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Dec 16 12:53:11.184130 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Dec 16 12:53:11.184211 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Dec 16 12:53:11.184287 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Dec 16 12:53:11.184297 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Dec 16 12:53:11.184372 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Dec 16 12:53:11.184451 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Dec 16 12:53:11.184462 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Dec 16 12:53:11.184471 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Dec 16 12:53:11.184478 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 16 12:53:11.184485 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 16 12:53:11.184492 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 16 12:53:11.184499 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 16 12:53:11.184506 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 16 12:53:11.184588 kernel: rtc_cmos 00:03: RTC can wake from S4
Dec 16 12:53:11.184599 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Dec 16 12:53:11.184682 kernel: rtc_cmos 00:03: registered as rtc0
Dec 16 12:53:11.184775 kernel: rtc_cmos 00:03: setting system clock to 2025-12-16T12:53:09 UTC (1765889589)
Dec 16 12:53:11.184851 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Dec 16 12:53:11.184860 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 16 12:53:11.184871 kernel: NET: Registered PF_INET6 protocol family
Dec 16 12:53:11.184878 kernel: Segment Routing with IPv6
Dec 16 12:53:11.184885 kernel: In-situ OAM (IOAM) with IPv6
Dec 16 12:53:11.184891 kernel: NET: Registered PF_PACKET protocol family
Dec 16 12:53:11.184898 kernel: Key type dns_resolver registered
Dec 16 12:53:11.184906 kernel: IPI shorthand broadcast: enabled
Dec 16 12:53:11.184912 kernel: sched_clock: Marking stable (2219009420, 250490315)->(2494659419, -25159684)
Dec 16 12:53:11.184919 kernel: registered taskstats version 1
Dec 16 12:53:11.184928 kernel: Loading compiled-in X.509 certificates
Dec 16 12:53:11.184934 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: b90706f42f055ab9f35fc8fc29156d877adb12c4'
Dec 16 12:53:11.184941 kernel: Demotion targets for Node 0: null
Dec 16 12:53:11.184948 kernel: Key type .fscrypt registered
Dec 16 12:53:11.184955 kernel: Key type fscrypt-provisioning registered
Dec 16 12:53:11.184962 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 16 12:53:11.184968 kernel: ima: Allocated hash algorithm: sha1
Dec 16 12:53:11.184976 kernel: ima: No architecture policies found
Dec 16 12:53:11.184983 kernel: clk: Disabling unused clocks
Dec 16 12:53:11.184990 kernel: Freeing unused kernel image (initmem) memory: 15464K
Dec 16 12:53:11.184997 kernel: Write protecting the kernel read-only data: 45056k
Dec 16 12:53:11.185004 kernel: Freeing unused kernel image (rodata/data gap) memory: 828K
Dec 16 12:53:11.185011 kernel: Run /init as init process
Dec 16 12:53:11.185018 kernel: with arguments:
Dec 16 12:53:11.185026 kernel: /init
Dec 16 12:53:11.185033 kernel: with environment:
Dec 16 12:53:11.185039 kernel: HOME=/
Dec 16 12:53:11.185046 kernel: TERM=linux
Dec 16 12:53:11.185053 kernel: ACPI: bus type USB registered
Dec 16 12:53:11.185060 kernel: usbcore: registered new interface driver usbfs
Dec 16 12:53:11.185066 kernel: usbcore: registered new interface driver hub
Dec 16 12:53:11.185073 kernel: usbcore: registered new device driver usb
Dec 16 12:53:11.185157 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 16 12:53:11.185239 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Dec 16 12:53:11.185320 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Dec 16 12:53:11.185399 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 16 12:53:11.185479 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Dec 16 12:53:11.185557 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Dec 16 12:53:11.185674 kernel: hub 1-0:1.0: USB hub found
Dec 16 12:53:11.185780 kernel: hub 1-0:1.0: 4 ports detected
Dec 16 12:53:11.185879 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Dec 16 12:53:11.185976 kernel: hub 2-0:1.0: USB hub found
Dec 16 12:53:11.186064 kernel: hub 2-0:1.0: 4 ports detected
Dec 16 12:53:11.186076 kernel: SCSI subsystem initialized
Dec 16 12:53:11.186084 kernel: libata version 3.00 loaded.
Dec 16 12:53:11.186165 kernel: ahci 0000:00:1f.2: version 3.0
Dec 16 12:53:11.186175 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Dec 16 12:53:11.186255 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Dec 16 12:53:11.186334 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Dec 16 12:53:11.186415 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Dec 16 12:53:11.186503 kernel: scsi host0: ahci
Dec 16 12:53:11.186587 kernel: scsi host1: ahci
Dec 16 12:53:11.186687 kernel: scsi host2: ahci
Dec 16 12:53:11.186789 kernel: scsi host3: ahci
Dec 16 12:53:11.186877 kernel: scsi host4: ahci
Dec 16 12:53:11.186959 kernel: scsi host5: ahci
Dec 16 12:53:11.186970 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 38 lpm-pol 1
Dec 16 12:53:11.186978 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 38 lpm-pol 1
Dec 16 12:53:11.186985 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 38 lpm-pol 1
Dec 16 12:53:11.186992 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 38 lpm-pol 1
Dec 16 12:53:11.187001 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 38 lpm-pol 1
Dec 16 12:53:11.187008 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 38 lpm-pol 1
Dec 16 12:53:11.187109 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Dec 16 12:53:11.187121 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 16 12:53:11.187128 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Dec 16 12:53:11.187135 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Dec 16 12:53:11.187143 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Dec 16 12:53:11.187152 kernel: ata1.00: LPM support broken, forcing max_power
Dec 16 12:53:11.187159 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 16 12:53:11.187165 kernel: ata1.00: applying bridge limits
Dec 16 12:53:11.187173 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Dec 16 12:53:11.187179 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Dec 16 12:53:11.187186 kernel: ata1.00: LPM support broken, forcing max_power
Dec 16 12:53:11.187193 kernel: ata1.00: configured for UDMA/100
Dec 16 12:53:11.187201 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Dec 16 12:53:11.187296 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Dec 16 12:53:11.187307 kernel: usbcore: registered new interface driver usbhid
Dec 16 12:53:11.187314 kernel: usbhid: USB HID core driver
Dec 16 12:53:11.187400 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Dec 16 12:53:11.187486 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 16 12:53:11.187498 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 16 12:53:11.187587 kernel: scsi host6: Virtio SCSI HBA
Dec 16 12:53:11.187715 kernel: scsi 6:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Dec 16 12:53:11.187811 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 16 12:53:11.187821 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3
Dec 16 12:53:11.187950 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Dec 16 12:53:11.188046 kernel: sd 6:0:0:0: Power-on or device reset occurred
Dec 16 12:53:11.188133 kernel: sd 6:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Dec 16 12:53:11.188216 kernel: sd 6:0:0:0: [sda] Write Protect is off
Dec 16 12:53:11.188300 kernel: sd 6:0:0:0: [sda] Mode Sense: 63 00 00 08
Dec 16 12:53:11.188385 kernel: sd 6:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Dec 16 12:53:11.188397 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 16 12:53:11.188405 kernel: GPT:25804799 != 80003071
Dec 16 12:53:11.188412 kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 16 12:53:11.188418 kernel: GPT:25804799 != 80003071
Dec 16 12:53:11.188425 kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 16 12:53:11.188432 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Dec 16 12:53:11.188518 kernel: sd 6:0:0:0: [sda] Attached SCSI disk
Dec 16 12:53:11.188530 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 16 12:53:11.188537 kernel: device-mapper: uevent: version 1.0.3
Dec 16 12:53:11.188544 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Dec 16 12:53:11.188552 kernel: device-mapper: verity: sha256 using shash "sha256-generic"
Dec 16 12:53:11.188559 kernel: raid6: avx2x4 gen() 33309 MB/s
Dec 16 12:53:11.188566 kernel: raid6: avx2x2 gen() 39265 MB/s
Dec 16 12:53:11.188572 kernel: raid6: avx2x1 gen() 31236 MB/s
Dec 16 12:53:11.188580 kernel: raid6: using algorithm avx2x2 gen() 39265 MB/s
Dec 16 12:53:11.188587 kernel: raid6: .... xor() 31864 MB/s, rmw enabled
Dec 16 12:53:11.188594 kernel: raid6: using avx2x2 recovery algorithm
Dec 16 12:53:11.188601 kernel: xor: automatically using best checksumming function avx
Dec 16 12:53:11.188608 kernel: Btrfs loaded, zoned=no, fsverity=no
Dec 16 12:53:11.188615 kernel: BTRFS: device fsid ea73a94a-fb20-4d45-8448-4c6f4c422a4f devid 1 transid 35 /dev/mapper/usr (254:0) scanned by mount (185)
Dec 16 12:53:11.188622 kernel: BTRFS info (device dm-0): first mount of filesystem ea73a94a-fb20-4d45-8448-4c6f4c422a4f
Dec 16 12:53:11.188630 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Dec 16 12:53:11.188637 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Dec 16 12:53:11.188657 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Dec 16 12:53:11.188664 kernel: BTRFS info (device dm-0): enabling free space tree
Dec 16 12:53:11.188671 kernel: loop: module loaded
Dec 16 12:53:11.188678 kernel: loop0: detected capacity change from 0 to 100136
Dec 16 12:53:11.188685 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Dec 16 12:53:11.190283 systemd[1]: Successfully made /usr/ read-only.
Dec 16 12:53:11.190303 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 16 12:53:11.190312 systemd[1]: Detected virtualization kvm.
Dec 16 12:53:11.190320 systemd[1]: Detected architecture x86-64.
Dec 16 12:53:11.190327 systemd[1]: Running in initrd.
Dec 16 12:53:11.190335 systemd[1]: No hostname configured, using default hostname.
Dec 16 12:53:11.190345 systemd[1]: Hostname set to .
Dec 16 12:53:11.190352 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Dec 16 12:53:11.190360 systemd[1]: Queued start job for default target initrd.target.
Dec 16 12:53:11.190368 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr.
Dec 16 12:53:11.190375 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 12:53:11.190383 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 12:53:11.190392 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Dec 16 12:53:11.190400 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 16 12:53:11.190408 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Dec 16 12:53:11.190415 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Dec 16 12:53:11.190423 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 12:53:11.190431 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 16 12:53:11.190439 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Dec 16 12:53:11.190447 systemd[1]: Reached target paths.target - Path Units.
Dec 16 12:53:11.190454 systemd[1]: Reached target slices.target - Slice Units.
Dec 16 12:53:11.190462 systemd[1]: Reached target swap.target - Swaps.
Dec 16 12:53:11.190469 systemd[1]: Reached target timers.target - Timer Units.
Dec 16 12:53:11.190476 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Dec 16 12:53:11.190484 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 16 12:53:11.190493 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Dec 16 12:53:11.190500 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Dec 16 12:53:11.190508 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Dec 16 12:53:11.190515 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 12:53:11.190522 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 16 12:53:11.190530 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 12:53:11.190538 systemd[1]: Reached target sockets.target - Socket Units.
Dec 16 12:53:11.190547 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Dec 16 12:53:11.190554 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Dec 16 12:53:11.190562 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 16 12:53:11.190569 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Dec 16 12:53:11.190577 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Dec 16 12:53:11.190585 systemd[1]: Starting systemd-fsck-usr.service...
Dec 16 12:53:11.190593 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 16 12:53:11.190600 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 16 12:53:11.190608 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:53:11.190616 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Dec 16 12:53:11.190624 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 12:53:11.190632 systemd[1]: Finished systemd-fsck-usr.service.
Dec 16 12:53:11.190640 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 16 12:53:11.190688 systemd-journald[322]: Collecting audit messages is enabled.
Dec 16 12:53:11.190747 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 16 12:53:11.190755 kernel: Bridge firewalling registered
Dec 16 12:53:11.190763 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 16 12:53:11.190771 kernel: audit: type=1130 audit(1765889591.179:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.190779 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 16 12:53:11.190787 systemd-journald[322]: Journal started
Dec 16 12:53:11.190807 systemd-journald[322]: Runtime Journal (/run/log/journal/d8696559cc88466d86ea033c7672c3f4) is 4.7M, max 38.2M, 33.4M free.
Dec 16 12:53:11.179000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.173882 systemd-modules-load[323]: Inserted module 'br_netfilter'
Dec 16 12:53:11.198061 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 16 12:53:11.275000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.281737 kernel: audit: type=1130 audit(1765889591.275:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.281728 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:53:11.290913 kernel: audit: type=1130 audit(1765889591.281:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.281000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.284390 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 12:53:11.300108 kernel: audit: type=1130 audit(1765889591.290:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.290000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.291823 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 16 12:53:11.307902 kernel: audit: type=1130 audit(1765889591.300:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.300000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.304810 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Dec 16 12:53:11.310000 audit: BPF prog-id=6 op=LOAD
Dec 16 12:53:11.314718 kernel: audit: type=1334 audit(1765889591.310:7): prog-id=6 op=LOAD
Dec 16 12:53:11.314853 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 16 12:53:11.317796 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 16 12:53:11.320317 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 16 12:53:11.330349 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 16 12:53:11.337954 kernel: audit: type=1130 audit(1765889591.330:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.330000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.339265 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Dec 16 12:53:11.345325 systemd-tmpfiles[349]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Dec 16 12:53:11.351283 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 12:53:11.365283 kernel: audit: type=1130 audit(1765889591.351:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.365306 kernel: audit: type=1130 audit(1765889591.364:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.351000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.364000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.354594 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 12:53:11.381742 dracut-cmdline[360]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=4dd8de2ff094d97322e7371b16ddee5fc8348868bcdd9ec7bcd11ea9d3933fee
Dec 16 12:53:11.385896 systemd-resolved[344]: Positive Trust Anchors:
Dec 16 12:53:11.385907 systemd-resolved[344]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 16 12:53:11.385910 systemd-resolved[344]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Dec 16 12:53:11.385934 systemd-resolved[344]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 16 12:53:11.414231 systemd-resolved[344]: Defaulting to hostname 'linux'.
Dec 16 12:53:11.415000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.415164 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 16 12:53:11.416366 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 16 12:53:11.471740 kernel: Loading iSCSI transport class v2.0-870.
Dec 16 12:53:11.485737 kernel: iscsi: registered transport (tcp)
Dec 16 12:53:11.505449 kernel: iscsi: registered transport (qla4xxx)
Dec 16 12:53:11.505518 kernel: QLogic iSCSI HBA Driver
Dec 16 12:53:11.527499 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 16 12:53:11.544119 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 12:53:11.544000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.545147 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 16 12:53:11.580773 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Dec 16 12:53:11.580000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.582761 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Dec 16 12:53:11.585815 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Dec 16 12:53:11.616486 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Dec 16 12:53:11.616000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.618000 audit: BPF prog-id=7 op=LOAD
Dec 16 12:53:11.619000 audit: BPF prog-id=8 op=LOAD
Dec 16 12:53:11.621814 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 12:53:11.645589 systemd-udevd[606]: Using default interface naming scheme 'v257'.
Dec 16 12:53:11.653881 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 12:53:11.654000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.656585 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Dec 16 12:53:11.664948 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 16 12:53:11.666000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.666000 audit: BPF prog-id=9 op=LOAD
Dec 16 12:53:11.669825 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 16 12:53:11.677634 dracut-pre-trigger[690]: rd.md=0: removing MD RAID activation
Dec 16 12:53:11.699000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.699389 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 16 12:53:11.702820 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 16 12:53:11.710799 systemd-networkd[699]: lo: Link UP
Dec 16 12:53:11.711563 systemd-networkd[699]: lo: Gained carrier
Dec 16 12:53:11.712000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.712020 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 16 12:53:11.713444 systemd[1]: Reached target network.target - Network.
Dec 16 12:53:11.756788 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 12:53:11.757000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.759386 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Dec 16 12:53:11.864313 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Dec 16 12:53:11.885964 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Dec 16 12:53:11.909031 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4
Dec 16 12:53:11.909210 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Dec 16 12:53:11.918711 kernel: cryptd: max_cpu_qlen set to 1000
Dec 16 12:53:11.923912 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Dec 16 12:53:11.926310 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Dec 16 12:53:11.935682 systemd-networkd[699]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:53:11.935689 systemd-networkd[699]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 16 12:53:11.944903 kernel: AES CTR mode by8 optimization enabled
Dec 16 12:53:11.942043 systemd-networkd[699]: eth0: Link UP
Dec 16 12:53:11.942217 systemd-networkd[699]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:53:11.942220 systemd-networkd[699]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 16 12:53:11.943057 systemd-networkd[699]: eth0: Gained carrier
Dec 16 12:53:11.943066 systemd-networkd[699]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:53:11.948962 systemd-networkd[699]: eth1: Link UP
Dec 16 12:53:11.949864 systemd-networkd[699]: eth1: Gained carrier
Dec 16 12:53:11.949873 systemd-networkd[699]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:53:11.957274 disk-uuid[793]: Primary Header is updated.
Dec 16 12:53:11.957274 disk-uuid[793]: Secondary Entries is updated.
Dec 16 12:53:11.957274 disk-uuid[793]: Secondary Header is updated.
Dec 16 12:53:11.963300 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 16 12:53:11.968000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:11.965277 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:53:11.969003 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:53:11.975249 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:53:11.993752 systemd-networkd[699]: eth0: DHCPv4 address 77.42.41.174/32, gateway 172.31.1.1 acquired from 172.31.1.1
Dec 16 12:53:11.994979 systemd-networkd[699]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Dec 16 12:53:12.046038 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Dec 16 12:53:12.118000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:12.121519 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:53:12.121000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:12.137461 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 16 12:53:12.138206 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 12:53:12.139908 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 16 12:53:12.143458 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Dec 16 12:53:12.166792 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Dec 16 12:53:12.166000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.050873 disk-uuid[816]: Warning: The kernel is still using the old partition table.
Dec 16 12:53:13.050873 disk-uuid[816]: The new table will be used at the next reboot or after you
Dec 16 12:53:13.050873 disk-uuid[816]: run partprobe(8) or kpartx(8)
Dec 16 12:53:13.050873 disk-uuid[816]: The operation has completed successfully.
Dec 16 12:53:13.059496 systemd[1]: disk-uuid.service: Deactivated successfully.
Dec 16 12:53:13.059724 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Dec 16 12:53:13.079571 kernel: kauditd_printk_skb: 16 callbacks suppressed
Dec 16 12:53:13.079606 kernel: audit: type=1130 audit(1765889593.060:27): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.079619 kernel: audit: type=1131 audit(1765889593.060:28): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.060000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.060000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.063571 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Dec 16 12:53:13.108728 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (863)
Dec 16 12:53:13.115856 kernel: BTRFS info (device sda6): first mount of filesystem c87e2a2e-b8fc-4d1d-98f3-593ea9a0f098
Dec 16 12:53:13.115976 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Dec 16 12:53:13.124077 kernel: BTRFS info (device sda6): enabling ssd optimizations
Dec 16 12:53:13.124146 kernel: BTRFS info (device sda6): turning on async discard
Dec 16 12:53:13.124160 kernel: BTRFS info (device sda6): enabling free space tree
Dec 16 12:53:13.142748 kernel: BTRFS info (device sda6): last unmount of filesystem c87e2a2e-b8fc-4d1d-98f3-593ea9a0f098
Dec 16 12:53:13.143576 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Dec 16 12:53:13.152602 kernel: audit: type=1130 audit(1765889593.143:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.143000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.146883 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Dec 16 12:53:13.192587 systemd-networkd[699]: eth1: Gained IPv6LL
Dec 16 12:53:13.278233 ignition[882]: Ignition 2.22.0
Dec 16 12:53:13.279112 ignition[882]: Stage: fetch-offline
Dec 16 12:53:13.279155 ignition[882]: no configs at "/usr/lib/ignition/base.d"
Dec 16 12:53:13.281681 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 16 12:53:13.289862 kernel: audit: type=1130 audit(1765889593.282:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.282000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.279164 ignition[882]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 12:53:13.284418 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Dec 16 12:53:13.279250 ignition[882]: parsed url from cmdline: ""
Dec 16 12:53:13.279253 ignition[882]: no config URL provided
Dec 16 12:53:13.279257 ignition[882]: reading system config file "/usr/lib/ignition/user.ign"
Dec 16 12:53:13.279264 ignition[882]: no config at "/usr/lib/ignition/user.ign"
Dec 16 12:53:13.279267 ignition[882]: failed to fetch config: resource requires networking
Dec 16 12:53:13.279423 ignition[882]: Ignition finished successfully
Dec 16 12:53:13.310732 ignition[889]: Ignition 2.22.0
Dec 16 12:53:13.310746 ignition[889]: Stage: fetch
Dec 16 12:53:13.310882 ignition[889]: no configs at "/usr/lib/ignition/base.d"
Dec 16 12:53:13.310891 ignition[889]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 12:53:13.310968 ignition[889]: parsed url from cmdline: ""
Dec 16 12:53:13.310971 ignition[889]: no config URL provided
Dec 16 12:53:13.310975 ignition[889]: reading system config file "/usr/lib/ignition/user.ign"
Dec 16 12:53:13.310982 ignition[889]: no config at "/usr/lib/ignition/user.ign"
Dec 16 12:53:13.311014 ignition[889]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Dec 16 12:53:13.315941 ignition[889]: GET result: OK
Dec 16 12:53:13.316007 ignition[889]: parsing config with SHA512: 24f2a03d7a316eee7c00d485de038f58026b8dbf241fe57c591a59afbec2e23efe2c017caaca03064e8d819e9ef3a7b1cbb687df08899a8cf066617c5b6702ec
Dec 16 12:53:13.324873 unknown[889]: fetched base config from "system"
Dec 16 12:53:13.325192 ignition[889]: fetch: fetch complete
Dec 16 12:53:13.324881 unknown[889]: fetched base config from "system"
Dec 16 12:53:13.325196 ignition[889]: fetch: fetch passed
Dec 16 12:53:13.324885 unknown[889]: fetched user config from "hetzner"
Dec 16 12:53:13.335342 kernel: audit: type=1130 audit(1765889593.327:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.327000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.325236 ignition[889]: Ignition finished successfully
Dec 16 12:53:13.327453 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Dec 16 12:53:13.329427 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Dec 16 12:53:13.357021 ignition[896]: Ignition 2.22.0
Dec 16 12:53:13.357033 ignition[896]: Stage: kargs
Dec 16 12:53:13.357149 ignition[896]: no configs at "/usr/lib/ignition/base.d"
Dec 16 12:53:13.359478 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Dec 16 12:53:13.367347 kernel: audit: type=1130 audit(1765889593.360:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.360000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.357157 ignition[896]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 12:53:13.363817 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Dec 16 12:53:13.357743 ignition[896]: kargs: kargs passed
Dec 16 12:53:13.357803 ignition[896]: Ignition finished successfully
Dec 16 12:53:13.387711 ignition[903]: Ignition 2.22.0
Dec 16 12:53:13.387722 ignition[903]: Stage: disks
Dec 16 12:53:13.387835 ignition[903]: no configs at "/usr/lib/ignition/base.d"
Dec 16 12:53:13.389759 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Dec 16 12:53:13.399183 kernel: audit: type=1130 audit(1765889593.390:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.390000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.387842 ignition[903]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 12:53:13.391580 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Dec 16 12:53:13.388749 ignition[903]: disks: disks passed
Dec 16 12:53:13.399864 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Dec 16 12:53:13.388784 ignition[903]: Ignition finished successfully
Dec 16 12:53:13.401514 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 16 12:53:13.403120 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 16 12:53:13.404466 systemd[1]: Reached target basic.target - Basic System.
Dec 16 12:53:13.406851 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Dec 16 12:53:13.451799 systemd-fsck[912]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks
Dec 16 12:53:13.454338 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Dec 16 12:53:13.464857 kernel: audit: type=1130 audit(1765889593.454:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.454000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.456725 systemd[1]: Mounting sysroot.mount - /sysroot...
Dec 16 12:53:13.569752 kernel: EXT4-fs (sda9): mounted filesystem 7cac6192-738c-43cc-9341-24f71d091e91 r/w with ordered data mode. Quota mode: none.
Dec 16 12:53:13.570050 systemd[1]: Mounted sysroot.mount - /sysroot.
Dec 16 12:53:13.571258 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Dec 16 12:53:13.574198 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 16 12:53:13.577771 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Dec 16 12:53:13.581824 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Dec 16 12:53:13.582729 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Dec 16 12:53:13.582765 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 16 12:53:13.591840 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Dec 16 12:53:13.595100 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Dec 16 12:53:13.604009 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (920)
Dec 16 12:53:13.608865 kernel: BTRFS info (device sda6): first mount of filesystem c87e2a2e-b8fc-4d1d-98f3-593ea9a0f098
Dec 16 12:53:13.613741 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Dec 16 12:53:13.634732 kernel: BTRFS info (device sda6): enabling ssd optimizations
Dec 16 12:53:13.634799 kernel: BTRFS info (device sda6): turning on async discard
Dec 16 12:53:13.634811 kernel: BTRFS info (device sda6): enabling free space tree
Dec 16 12:53:13.641678 systemd-networkd[699]: eth0: Gained IPv6LL
Dec 16 12:53:13.644381 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 16 12:53:13.676047 initrd-setup-root[947]: cut: /sysroot/etc/passwd: No such file or directory
Dec 16 12:53:13.677332 coreos-metadata[922]: Dec 16 12:53:13.677 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Dec 16 12:53:13.679854 coreos-metadata[922]: Dec 16 12:53:13.678 INFO Fetch successful
Dec 16 12:53:13.681451 coreos-metadata[922]: Dec 16 12:53:13.680 INFO wrote hostname ci-4515-1-0-8-2e3d7ab7bb to /sysroot/etc/hostname
Dec 16 12:53:13.681709 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Dec 16 12:53:13.690941 kernel: audit: type=1130 audit(1765889593.682:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.682000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.690997 initrd-setup-root[955]: cut: /sysroot/etc/group: No such file or directory
Dec 16 12:53:13.693672 initrd-setup-root[962]: cut: /sysroot/etc/shadow: No such file or directory
Dec 16 12:53:13.697895 initrd-setup-root[969]: cut: /sysroot/etc/gshadow: No such file or directory
Dec 16 12:53:13.778254 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Dec 16 12:53:13.787940 kernel: audit: type=1130 audit(1765889593.778:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.778000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.781799 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Dec 16 12:53:13.793472 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Dec 16 12:53:13.798105 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Dec 16 12:53:13.809789 kernel: BTRFS info (device sda6): last unmount of filesystem c87e2a2e-b8fc-4d1d-98f3-593ea9a0f098
Dec 16 12:53:13.830673 ignition[1036]: INFO : Ignition 2.22.0
Dec 16 12:53:13.830673 ignition[1036]: INFO : Stage: mount
Dec 16 12:53:13.830673 ignition[1036]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 16 12:53:13.830673 ignition[1036]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 12:53:13.830673 ignition[1036]: INFO : mount: mount passed
Dec 16 12:53:13.830673 ignition[1036]: INFO : Ignition finished successfully
Dec 16 12:53:13.833000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.834000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:13.832987 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Dec 16 12:53:13.834154 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Dec 16 12:53:13.836877 systemd[1]: Starting ignition-files.service - Ignition (files)...
Dec 16 12:53:14.571791 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 16 12:53:14.605752 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1049)
Dec 16 12:53:14.613173 kernel: BTRFS info (device sda6): first mount of filesystem c87e2a2e-b8fc-4d1d-98f3-593ea9a0f098
Dec 16 12:53:14.613272 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Dec 16 12:53:14.623385 kernel: BTRFS info (device sda6): enabling ssd optimizations
Dec 16 12:53:14.623445 kernel: BTRFS info (device sda6): turning on async discard
Dec 16 12:53:14.623467 kernel: BTRFS info (device sda6): enabling free space tree
Dec 16 12:53:14.629928 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 16 12:53:14.668031 ignition[1066]: INFO : Ignition 2.22.0
Dec 16 12:53:14.668031 ignition[1066]: INFO : Stage: files
Dec 16 12:53:14.669913 ignition[1066]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 16 12:53:14.669913 ignition[1066]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 12:53:14.672303 ignition[1066]: DEBUG : files: compiled without relabeling support, skipping
Dec 16 12:53:14.672303 ignition[1066]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Dec 16 12:53:14.672303 ignition[1066]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Dec 16 12:53:14.677331 ignition[1066]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Dec 16 12:53:14.678452 ignition[1066]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Dec 16 12:53:14.679761 ignition[1066]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Dec 16 12:53:14.678553 unknown[1066]: wrote ssh authorized keys file for user: core
Dec 16 12:53:14.681986 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Dec 16 12:53:14.683298 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Dec 16 12:53:15.066777 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Dec 16 12:53:15.375657 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Dec 16 12:53:15.375657 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Dec 16 12:53:15.378857 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Dec 16 12:53:15.378857 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Dec 16 12:53:15.378857 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Dec 16 12:53:15.378857 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 16 12:53:15.378857 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 16 12:53:15.378857 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 16 12:53:15.378857 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 16 12:53:15.378857 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Dec 16 12:53:15.378857 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Dec 16 12:53:15.378857 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Dec 16 12:53:15.391202 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Dec 16 12:53:15.391202 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Dec 16 12:53:15.391202 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Dec 16 12:53:15.809232 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Dec 16 12:53:16.085847 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Dec 16 12:53:16.085847 ignition[1066]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Dec 16 12:53:16.088653 ignition[1066]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 16 12:53:16.090390 ignition[1066]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 16 12:53:16.090390 ignition[1066]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Dec 16 12:53:16.090390 ignition[1066]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Dec 16 12:53:16.096311 ignition[1066]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Dec 16 12:53:16.096311 ignition[1066]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Dec 16 12:53:16.096311 ignition[1066]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Dec 16 12:53:16.096311 ignition[1066]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Dec 16 12:53:16.096311 ignition[1066]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Dec 16 12:53:16.096311 ignition[1066]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Dec 16 12:53:16.096311 ignition[1066]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Dec 16 12:53:16.096311 ignition[1066]: INFO : files: files passed
Dec 16 12:53:16.096311 ignition[1066]: INFO : Ignition finished successfully
Dec 16 12:53:16.097000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.096544 systemd[1]: Finished ignition-files.service - Ignition (files).
Dec 16 12:53:16.100852 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Dec 16 12:53:16.104835 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Dec 16 12:53:16.116113 systemd[1]: ignition-quench.service: Deactivated successfully.
Dec 16 12:53:16.117031 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Dec 16 12:53:16.119000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.119000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.123449 initrd-setup-root-after-ignition[1098]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 12:53:16.124748 initrd-setup-root-after-ignition[1102]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 12:53:16.126515 initrd-setup-root-after-ignition[1098]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 12:53:16.128000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.128011 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 16 12:53:16.129446 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Dec 16 12:53:16.131152 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Dec 16 12:53:16.177482 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 16 12:53:16.177598 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Dec 16 12:53:16.178000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.178000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.179714 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Dec 16 12:53:16.181166 systemd[1]: Reached target initrd.target - Initrd Default Target.
Dec 16 12:53:16.183140 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Dec 16 12:53:16.184005 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Dec 16 12:53:16.215186 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 16 12:53:16.215000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.217860 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Dec 16 12:53:16.238408 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr.
Dec 16 12:53:16.238639 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Dec 16 12:53:16.240522 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 12:53:16.242428 systemd[1]: Stopped target timers.target - Timer Units.
Dec 16 12:53:16.251735 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 16 12:53:16.252000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.251904 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 16 12:53:16.253884 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Dec 16 12:53:16.255028 systemd[1]: Stopped target basic.target - Basic System.
Dec 16 12:53:16.256798 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Dec 16 12:53:16.258393 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 16 12:53:16.260141 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Dec 16 12:53:16.262297 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Dec 16 12:53:16.264461 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Dec 16 12:53:16.266485 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 16 12:53:16.268839 systemd[1]: Stopped target sysinit.target - System Initialization.
Dec 16 12:53:16.270866 systemd[1]: Stopped target local-fs.target - Local File Systems.
Dec 16 12:53:16.273225 systemd[1]: Stopped target swap.target - Swaps.
Dec 16 12:53:16.275000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.275136 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 16 12:53:16.275335 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Dec 16 12:53:16.277109 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Dec 16 12:53:16.278331 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 12:53:16.283000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.279974 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Dec 16 12:53:16.285000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.280483 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 12:53:16.286000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.281894 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 16 12:53:16.288000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.282099 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Dec 16 12:53:16.284513 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Dec 16 12:53:16.284727 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 16 12:53:16.285840 systemd[1]: ignition-files.service: Deactivated successfully.
Dec 16 12:53:16.294000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.285982 systemd[1]: Stopped ignition-files.service - Ignition (files).
Dec 16 12:53:16.287459 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Dec 16 12:53:16.299000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.287606 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Dec 16 12:53:16.290328 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Dec 16 12:53:16.305000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.291996 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 16 12:53:16.293817 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 12:53:16.308000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.296880 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Dec 16 12:53:16.298532 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 16 12:53:16.298738 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 12:53:16.300417 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 16 12:53:16.300558 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 12:53:16.305882 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 16 12:53:16.319000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.319000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.306054 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 16 12:53:16.321000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.324117 ignition[1122]: INFO : Ignition 2.22.0
Dec 16 12:53:16.324117 ignition[1122]: INFO : Stage: umount
Dec 16 12:53:16.324117 ignition[1122]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 16 12:53:16.324117 ignition[1122]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 12:53:16.324117 ignition[1122]: INFO : umount: umount passed
Dec 16 12:53:16.324117 ignition[1122]: INFO : Ignition finished successfully
Dec 16 12:53:16.324000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.327000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.332000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.334000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.317545 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 16 12:53:16.317646 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Dec 16 12:53:16.321829 systemd[1]: ignition-mount.service: Deactivated successfully.
Dec 16 12:53:16.321898 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Dec 16 12:53:16.322970 systemd[1]: ignition-disks.service: Deactivated successfully.
Dec 16 12:53:16.323022 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Dec 16 12:53:16.325778 systemd[1]: ignition-kargs.service: Deactivated successfully.
Dec 16 12:53:16.325824 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Dec 16 12:53:16.328510 systemd[1]: ignition-fetch.service: Deactivated successfully.
Dec 16 12:53:16.328564 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Dec 16 12:53:16.332842 systemd[1]: Stopped target network.target - Network.
Dec 16 12:53:16.334035 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Dec 16 12:53:16.334087 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 16 12:53:16.351000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.335008 systemd[1]: Stopped target paths.target - Path Units.
Dec 16 12:53:16.353000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.336818 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 16 12:53:16.341940 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 12:53:16.343135 systemd[1]: Stopped target slices.target - Slice Units.
Dec 16 12:53:16.344538 systemd[1]: Stopped target sockets.target - Socket Units.
Dec 16 12:53:16.345976 systemd[1]: iscsid.socket: Deactivated successfully.
Dec 16 12:53:16.346012 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Dec 16 12:53:16.347384 systemd[1]: iscsiuio.socket: Deactivated successfully.
Dec 16 12:53:16.347415 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 16 12:53:16.363000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.349114 systemd[1]: systemd-journald-audit.socket: Deactivated successfully.
Dec 16 12:53:16.349138 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket.
Dec 16 12:53:16.366000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.350520 systemd[1]: ignition-setup.service: Deactivated successfully.
Dec 16 12:53:16.350566 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Dec 16 12:53:16.352156 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Dec 16 12:53:16.352196 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Dec 16 12:53:16.353839 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Dec 16 12:53:16.371000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.355546 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Dec 16 12:53:16.373000 audit: BPF prog-id=6 op=UNLOAD
Dec 16 12:53:16.360241 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Dec 16 12:53:16.361091 systemd[1]: sysroot-boot.service: Deactivated successfully.
Dec 16 12:53:16.361227 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Dec 16 12:53:16.364552 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Dec 16 12:53:16.364648 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Dec 16 12:53:16.377000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.370129 systemd[1]: systemd-resolved.service: Deactivated successfully.
Dec 16 12:53:16.370253 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Dec 16 12:53:16.376329 systemd[1]: systemd-networkd.service: Deactivated successfully.
Dec 16 12:53:16.379000 audit: BPF prog-id=9 op=UNLOAD
Dec 16 12:53:16.376441 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Dec 16 12:53:16.380336 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Dec 16 12:53:16.382580 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Dec 16 12:53:16.382624 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 12:53:16.385381 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Dec 16 12:53:16.387326 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Dec 16 12:53:16.396000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.387393 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 16 12:53:16.398000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.397649 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 16 12:53:16.400000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.397772 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Dec 16 12:53:16.399088 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 16 12:53:16.399143 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Dec 16 12:53:16.400962 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 12:53:16.418920 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 16 12:53:16.420235 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 12:53:16.420000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.422553 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 16 12:53:16.422625 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Dec 16 12:53:16.423899 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 16 12:53:16.427000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.423937 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 12:53:16.429000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.426201 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 16 12:53:16.431000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.426266 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Dec 16 12:53:16.428790 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 16 12:53:16.438000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.428844 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Dec 16 12:53:16.430585 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Dec 16 12:53:16.440000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.430639 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 16 12:53:16.433197 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Dec 16 12:53:16.442000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.435333 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Dec 16 12:53:16.435398 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 12:53:16.438788 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 16 12:53:16.438846 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 12:53:16.446000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.440878 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 16 12:53:16.440971 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:53:16.443834 systemd[1]: network-cleanup.service: Deactivated successfully.
Dec 16 12:53:16.445822 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Dec 16 12:53:16.453565 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 16 12:53:16.453739 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Dec 16 12:53:16.455000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.455000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:16.456152 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Dec 16 12:53:16.458624 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Dec 16 12:53:16.482296 systemd[1]: Switching root.
Dec 16 12:53:16.519692 systemd-journald[322]: Journal stopped
Dec 16 12:53:17.620775 systemd-journald[322]: Received SIGTERM from PID 1 (systemd).
Dec 16 12:53:17.620827 kernel: SELinux: policy capability network_peer_controls=1
Dec 16 12:53:17.620839 kernel: SELinux: policy capability open_perms=1
Dec 16 12:53:17.620850 kernel: SELinux: policy capability extended_socket_class=1
Dec 16 12:53:17.620861 kernel: SELinux: policy capability always_check_network=0
Dec 16 12:53:17.620870 kernel: SELinux: policy capability cgroup_seclabel=1
Dec 16 12:53:17.620880 kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 16 12:53:17.620891 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Dec 16 12:53:17.620901 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Dec 16 12:53:17.620911 kernel: SELinux: policy capability userspace_initial_context=0
Dec 16 12:53:17.620922 systemd[1]: Successfully loaded SELinux policy in 75.102ms.
Dec 16 12:53:17.620938 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.923ms.
Dec 16 12:53:17.620948 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 16 12:53:17.620958 systemd[1]: Detected virtualization kvm.
Dec 16 12:53:17.620968 systemd[1]: Detected architecture x86-64.
Dec 16 12:53:17.620977 systemd[1]: Detected first boot.
Dec 16 12:53:17.620989 systemd[1]: Hostname set to .
Dec 16 12:53:17.620998 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Dec 16 12:53:17.621007 zram_generator::config[1165]: No configuration found.
Dec 16 12:53:17.621019 kernel: Guest personality initialized and is inactive
Dec 16 12:53:17.621029 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Dec 16 12:53:17.621037 kernel: Initialized host personality
Dec 16 12:53:17.621046 kernel: NET: Registered PF_VSOCK protocol family
Dec 16 12:53:17.621057 systemd[1]: Populated /etc with preset unit settings.
Dec 16 12:53:17.621066 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 16 12:53:17.621075 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Dec 16 12:53:17.621084 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 16 12:53:17.621098 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Dec 16 12:53:17.621107 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Dec 16 12:53:17.621116 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Dec 16 12:53:17.621128 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Dec 16 12:53:17.621144 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Dec 16 12:53:17.621159 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Dec 16 12:53:17.621171 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Dec 16 12:53:17.621180 systemd[1]: Created slice user.slice - User and Session Slice.
Dec 16 12:53:17.621189 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 12:53:17.621199 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 12:53:17.621210 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Dec 16 12:53:17.621221 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Dec 16 12:53:17.621230 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Dec 16 12:53:17.621244 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 16 12:53:17.621262 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Dec 16 12:53:17.621279 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 12:53:17.621294 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 16 12:53:17.621304 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Dec 16 12:53:17.621312 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Dec 16 12:53:17.621322 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Dec 16 12:53:17.621331 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Dec 16 12:53:17.621340 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 12:53:17.621351 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 16 12:53:17.621361 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes.
Dec 16 12:53:17.621370 systemd[1]: Reached target slices.target - Slice Units.
Dec 16 12:53:17.621379 systemd[1]: Reached target swap.target - Swaps.
Dec 16 12:53:17.621388 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Dec 16 12:53:17.621397 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Dec 16 12:53:17.621406 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Dec 16 12:53:17.621417 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Dec 16 12:53:17.621426 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket.
Dec 16 12:53:17.621435 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 12:53:17.621444 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket.
Dec 16 12:53:17.621453 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket.
Dec 16 12:53:17.621463 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 16 12:53:17.621472 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 12:53:17.621482 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Dec 16 12:53:17.621491 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Dec 16 12:53:17.621500 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Dec 16 12:53:17.621510 systemd[1]: Mounting media.mount - External Media Directory...
Dec 16 12:53:17.621520 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 12:53:17.621529 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Dec 16 12:53:17.621538 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Dec 16 12:53:17.621548 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Dec 16 12:53:17.621559 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 16 12:53:17.621568 systemd[1]: Reached target machines.target - Containers.
Dec 16 12:53:17.621577 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Dec 16 12:53:17.621587 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 12:53:17.621596 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 16 12:53:17.621605 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Dec 16 12:53:17.621616 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 12:53:17.621624 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 16 12:53:17.621633 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 12:53:17.621642 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Dec 16 12:53:17.621651 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 12:53:17.621661 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Dec 16 12:53:17.621683 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 16 12:53:17.621693 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Dec 16 12:53:17.623748 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Dec 16 12:53:17.623761 systemd[1]: Stopped systemd-fsck-usr.service.
Dec 16 12:53:17.623775 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 12:53:17.623785 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 16 12:53:17.623795 kernel: fuse: init (API version 7.41)
Dec 16 12:53:17.623806 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 16 12:53:17.623815 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 16 12:53:17.623825 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Dec 16 12:53:17.623835 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Dec 16 12:53:17.623846 kernel: ACPI: bus type drm_connector registered
Dec 16 12:53:17.623855 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 16 12:53:17.623865 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 12:53:17.623874 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Dec 16 12:53:17.623883 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Dec 16 12:53:17.623892 systemd[1]: Mounted media.mount - External Media Directory.
Dec 16 12:53:17.623918 systemd-journald[1257]: Collecting audit messages is enabled.
Dec 16 12:53:17.623945 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Dec 16 12:53:17.623956 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Dec 16 12:53:17.623967 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Dec 16 12:53:17.623978 systemd-journald[1257]: Journal started
Dec 16 12:53:17.623997 systemd-journald[1257]: Runtime Journal (/run/log/journal/d8696559cc88466d86ea033c7672c3f4) is 4.7M, max 38.2M, 33.4M free.
Dec 16 12:53:17.398000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1
Dec 16 12:53:17.535000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.541000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.546000 audit: BPF prog-id=14 op=UNLOAD
Dec 16 12:53:17.546000 audit: BPF prog-id=13 op=UNLOAD
Dec 16 12:53:17.547000 audit: BPF prog-id=15 op=LOAD
Dec 16 12:53:17.547000 audit: BPF prog-id=16 op=LOAD
Dec 16 12:53:17.547000 audit: BPF prog-id=17 op=LOAD
Dec 16 12:53:17.618000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1
Dec 16 12:53:17.618000 audit[1257]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=5 a1=7fff0007f460 a2=4000 a3=0 items=0 ppid=1 pid=1257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:53:17.618000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald"
Dec 16 12:53:17.294097 systemd[1]: Queued start job for default target multi-user.target.
Dec 16 12:53:17.302692 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Dec 16 12:53:17.303146 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 16 12:53:17.626753 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Dec 16 12:53:17.626000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.628839 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 16 12:53:17.628000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.630436 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 12:53:17.630000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.631400 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 16 12:53:17.631608 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Dec 16 12:53:17.631000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.631000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.632587 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 12:53:17.632822 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 12:53:17.632000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.632000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.633871 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 16 12:53:17.633983 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 16 12:53:17.633000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.634000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.634988 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 12:53:17.635165 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 12:53:17.635000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.635000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.636197 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 16 12:53:17.636363 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Dec 16 12:53:17.636000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.636000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.637272 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 12:53:17.637379 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 12:53:17.637000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.637000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.638476 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 16 12:53:17.638000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.639000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.641000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.639557 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 12:53:17.641122 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Dec 16 12:53:17.642215 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Dec 16 12:53:17.642000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.651241 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 16 12:53:17.652751 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket.
Dec 16 12:53:17.655773 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Dec 16 12:53:17.659764 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Dec 16 12:53:17.660790 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Dec 16 12:53:17.660815 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 16 12:53:17.663435 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Dec 16 12:53:17.665862 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 12:53:17.665968 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Dec 16 12:53:17.668951 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Dec 16 12:53:17.670813 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Dec 16 12:53:17.671934 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 16 12:53:17.678800 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Dec 16 12:53:17.679502 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 16 12:53:17.680207 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 16 12:53:17.681814 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Dec 16 12:53:17.687000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.689565 systemd-journald[1257]: Time spent on flushing to /var/log/journal/d8696559cc88466d86ea033c7672c3f4 is 56.466ms for 1289 entries.
Dec 16 12:53:17.689565 systemd-journald[1257]: System Journal (/var/log/journal/d8696559cc88466d86ea033c7672c3f4) is 8M, max 588.1M, 580.1M free.
Dec 16 12:53:17.765823 systemd-journald[1257]: Received client request to flush runtime journal.
Dec 16 12:53:17.765860 kernel: loop1: detected capacity change from 0 to 111544
Dec 16 12:53:17.765877 kernel: loop2: detected capacity change from 0 to 119256
Dec 16 12:53:17.712000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.730000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.684789 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Dec 16 12:53:17.687813 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 12:53:17.688769 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Dec 16 12:53:17.691849 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Dec 16 12:53:17.711751 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Dec 16 12:53:17.713367 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Dec 16 12:53:17.722235 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Dec 16 12:53:17.730614 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 16 12:53:17.769849 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Dec 16 12:53:17.773000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.774595 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Dec 16 12:53:17.780000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.782063 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Dec 16 12:53:17.782000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.785000 audit: BPF prog-id=18 op=LOAD
Dec 16 12:53:17.785000 audit: BPF prog-id=19 op=LOAD
Dec 16 12:53:17.785000 audit: BPF prog-id=20 op=LOAD
Dec 16 12:53:17.788000 audit: BPF prog-id=21 op=LOAD
Dec 16 12:53:17.787321 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer...
Dec 16 12:53:17.789806 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 16 12:53:17.791465 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 16 12:53:17.805000 audit: BPF prog-id=22 op=LOAD
Dec 16 12:53:17.805000 audit: BPF prog-id=23 op=LOAD
Dec 16 12:53:17.805000 audit: BPF prog-id=24 op=LOAD
Dec 16 12:53:17.808000 audit: BPF prog-id=25 op=LOAD
Dec 16 12:53:17.808000 audit: BPF prog-id=26 op=LOAD
Dec 16 12:53:17.808000 audit: BPF prog-id=27 op=LOAD
Dec 16 12:53:17.808181 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager...
Dec 16 12:53:17.809923 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Dec 16 12:53:17.813756 kernel: loop3: detected capacity change from 0 to 224512
Dec 16 12:53:17.827016 systemd-tmpfiles[1311]: ACLs are not supported, ignoring.
Dec 16 12:53:17.827376 systemd-tmpfiles[1311]: ACLs are not supported, ignoring.
Dec 16 12:53:17.837088 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 12:53:17.838000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.847720 kernel: loop4: detected capacity change from 0 to 8
Dec 16 12:53:17.864803 systemd-nsresourced[1313]: Not setting up BPF subsystem, as functionality has been disabled at compile time.
Dec 16 12:53:17.865564 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager.
Dec 16 12:53:17.866000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.868167 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Dec 16 12:53:17.872781 kernel: loop5: detected capacity change from 0 to 111544
Dec 16 12:53:17.870000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.895729 kernel: loop6: detected capacity change from 0 to 119256
Dec 16 12:53:17.912717 kernel: loop7: detected capacity change from 0 to 224512
Dec 16 12:53:17.939713 kernel: loop1: detected capacity change from 0 to 8
Dec 16 12:53:17.947266 (sd-merge)[1324]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-hetzner.raw'.
Dec 16 12:53:17.955796 (sd-merge)[1324]: Merged extensions into '/usr'.
Dec 16 12:53:17.959185 systemd-resolved[1310]: Positive Trust Anchors:
Dec 16 12:53:17.960030 systemd-oomd[1309]: No swap; memory pressure usage will be degraded
Dec 16 12:53:17.961636 systemd-resolved[1310]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 16 12:53:17.961734 systemd-resolved[1310]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Dec 16 12:53:17.961857 systemd-resolved[1310]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 16 12:53:17.961923 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer.
Dec 16 12:53:17.962000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:17.963958 systemd[1]: Reload requested from client PID 1291 ('systemd-sysext') (unit systemd-sysext.service)...
Dec 16 12:53:17.963966 systemd[1]: Reloading...
Dec 16 12:53:17.982964 systemd-resolved[1310]: Using system hostname 'ci-4515-1-0-8-2e3d7ab7bb'.
Dec 16 12:53:18.027721 zram_generator::config[1359]: No configuration found.
Dec 16 12:53:18.180357 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Dec 16 12:53:18.180511 systemd[1]: Reloading finished in 216 ms.
Dec 16 12:53:18.211369 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 16 12:53:18.211000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.212307 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Dec 16 12:53:18.213752 kernel: kauditd_printk_skb: 113 callbacks suppressed
Dec 16 12:53:18.213788 kernel: audit: type=1130 audit(1765889598.211:148): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.219000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.220623 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Dec 16 12:53:18.221846 kernel: audit: type=1130 audit(1765889598.219:149): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.226000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.229329 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 16 12:53:18.233750 kernel: audit: type=1130 audit(1765889598.226:150): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.251896 systemd[1]: Starting ensure-sysext.service...
Dec 16 12:53:18.255000 audit: BPF prog-id=8 op=UNLOAD
Dec 16 12:53:18.253540 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 16 12:53:18.258139 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 12:53:18.258898 kernel: audit: type=1334 audit(1765889598.255:151): prog-id=8 op=UNLOAD
Dec 16 12:53:18.255000 audit: BPF prog-id=7 op=UNLOAD
Dec 16 12:53:18.263202 kernel: audit: type=1334 audit(1765889598.255:152): prog-id=7 op=UNLOAD
Dec 16 12:53:18.263276 kernel: audit: type=1334 audit(1765889598.255:153): prog-id=28 op=LOAD
Dec 16 12:53:18.255000 audit: BPF prog-id=28 op=LOAD
Dec 16 12:53:18.255000 audit: BPF prog-id=29 op=LOAD
Dec 16 12:53:18.259000 audit: BPF prog-id=30 op=LOAD
Dec 16 12:53:18.259000 audit: BPF prog-id=21 op=UNLOAD
Dec 16 12:53:18.259000 audit: BPF prog-id=31 op=LOAD
Dec 16 12:53:18.259000 audit: BPF prog-id=15 op=UNLOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=32 op=LOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=33 op=LOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=16 op=UNLOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=17 op=UNLOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=34 op=LOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=22 op=UNLOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=35 op=LOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=36 op=LOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=23 op=UNLOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=24 op=UNLOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=37 op=LOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=25 op=UNLOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=38 op=LOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=39 op=LOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=26 op=UNLOAD
Dec 16 12:53:18.260000 audit: BPF prog-id=27 op=UNLOAD
Dec 16 12:53:18.264872 kernel: audit: type=1334 audit(1765889598.255:154): prog-id=29 op=LOAD
Dec 16 12:53:18.264908 kernel: audit: type=1334 audit(1765889598.259:155): prog-id=30 op=LOAD
Dec 16 12:53:18.264924 kernel: audit: type=1334 audit(1765889598.259:156): prog-id=21 op=UNLOAD
Dec 16 12:53:18.264937 kernel: audit: type=1334 audit(1765889598.259:157): prog-id=31 op=LOAD
Dec 16 12:53:18.268000 audit: BPF prog-id=40 op=LOAD
Dec 16 12:53:18.268000 audit: BPF prog-id=18 op=UNLOAD
Dec 16 12:53:18.268000 audit: BPF prog-id=41 op=LOAD
Dec 16 12:53:18.268000 audit: BPF prog-id=42 op=LOAD
Dec 16 12:53:18.268000 audit: BPF prog-id=19 op=UNLOAD
Dec 16 12:53:18.268000 audit: BPF prog-id=20 op=UNLOAD
Dec 16 12:53:18.279302 systemd[1]: Reload requested from client PID 1406 ('systemctl') (unit ensure-sysext.service)...
Dec 16 12:53:18.279422 systemd[1]: Reloading...
Dec 16 12:53:18.291935 systemd-tmpfiles[1407]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Dec 16 12:53:18.291967 systemd-tmpfiles[1407]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Dec 16 12:53:18.292153 systemd-tmpfiles[1407]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Dec 16 12:53:18.295028 systemd-tmpfiles[1407]: ACLs are not supported, ignoring.
Dec 16 12:53:18.295081 systemd-tmpfiles[1407]: ACLs are not supported, ignoring.
Dec 16 12:53:18.301385 systemd-tmpfiles[1407]: Detected autofs mount point /boot during canonicalization of boot.
Dec 16 12:53:18.301393 systemd-tmpfiles[1407]: Skipping /boot
Dec 16 12:53:18.308141 systemd-udevd[1408]: Using default interface naming scheme 'v257'.
Dec 16 12:53:18.309631 systemd-tmpfiles[1407]: Detected autofs mount point /boot during canonicalization of boot.
Dec 16 12:53:18.309644 systemd-tmpfiles[1407]: Skipping /boot
Dec 16 12:53:18.355725 zram_generator::config[1451]: No configuration found.
Dec 16 12:53:18.454724 kernel: mousedev: PS/2 mouse device common for all mice
Dec 16 12:53:18.491742 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5
Dec 16 12:53:18.525723 kernel: ACPI: button: Power Button [PWRF]
Dec 16 12:53:18.554668 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Dec 16 12:53:18.554751 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Dec 16 12:53:18.566724 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Dec 16 12:53:18.566906 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 16 12:53:18.569403 kernel: Console: switching to colour dummy device 80x25
Dec 16 12:53:18.574413 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 16 12:53:18.574448 kernel: [drm] features: -context_init
Dec 16 12:53:18.580201 kernel: [drm] number of scanouts: 1
Dec 16 12:53:18.580239 kernel: [drm] number of cap sets: 0
Dec 16 12:53:18.584713 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Dec 16 12:53:18.590903 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 16 12:53:18.590954 kernel: Console: switching to colour frame buffer device 160x50
Dec 16 12:53:18.599728 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 16 12:53:18.651628 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Dec 16 12:53:18.654742 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Dec 16 12:53:18.655172 systemd[1]: Reloading finished in 375 ms.
Dec 16 12:53:18.680316 kernel: EDAC MC: Ver: 3.0.0
Dec 16 12:53:18.678214 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 12:53:18.679000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.681000 audit: BPF prog-id=43 op=LOAD
Dec 16 12:53:18.681000 audit: BPF prog-id=34 op=UNLOAD
Dec 16 12:53:18.681000 audit: BPF prog-id=44 op=LOAD
Dec 16 12:53:18.681000 audit: BPF prog-id=45 op=LOAD
Dec 16 12:53:18.682000 audit: BPF prog-id=35 op=UNLOAD
Dec 16 12:53:18.682000 audit: BPF prog-id=36 op=UNLOAD
Dec 16 12:53:18.682000 audit: BPF prog-id=46 op=LOAD
Dec 16 12:53:18.682000 audit: BPF prog-id=47 op=LOAD
Dec 16 12:53:18.682000 audit: BPF prog-id=28 op=UNLOAD
Dec 16 12:53:18.682000 audit: BPF prog-id=29 op=UNLOAD
Dec 16 12:53:18.683000 audit: BPF prog-id=48 op=LOAD
Dec 16 12:53:18.685000 audit: BPF prog-id=31 op=UNLOAD
Dec 16 12:53:18.685000 audit: BPF prog-id=49 op=LOAD
Dec 16 12:53:18.685000 audit: BPF prog-id=50 op=LOAD
Dec 16 12:53:18.685000 audit: BPF prog-id=32 op=UNLOAD
Dec 16 12:53:18.685000 audit: BPF prog-id=33 op=UNLOAD
Dec 16 12:53:18.685000 audit: BPF prog-id=51 op=LOAD
Dec 16 12:53:18.686000 audit: BPF prog-id=30 op=UNLOAD
Dec 16 12:53:18.687000 audit: BPF prog-id=52 op=LOAD
Dec 16 12:53:18.687000 audit: BPF prog-id=37 op=UNLOAD
Dec 16 12:53:18.687000 audit: BPF prog-id=53 op=LOAD
Dec 16 12:53:18.687000 audit: BPF prog-id=54 op=LOAD
Dec 16 12:53:18.687000 audit: BPF prog-id=38 op=UNLOAD
Dec 16 12:53:18.687000 audit: BPF prog-id=39 op=UNLOAD
Dec 16 12:53:18.689000 audit: BPF prog-id=55 op=LOAD
Dec 16 12:53:18.689000 audit: BPF prog-id=40 op=UNLOAD
Dec 16 12:53:18.689000 audit: BPF prog-id=56 op=LOAD
Dec 16 12:53:18.689000 audit: BPF prog-id=57 op=LOAD
Dec 16 12:53:18.689000 audit: BPF prog-id=41 op=UNLOAD
Dec 16 12:53:18.689000 audit: BPF prog-id=42 op=UNLOAD
Dec 16 12:53:18.693321 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 12:53:18.695000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.753127 systemd[1]: Finished ensure-sysext.service.
Dec 16 12:53:18.755000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.761826 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 12:53:18.762759 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 16 12:53:18.766864 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Dec 16 12:53:18.767593 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 12:53:18.770752 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 12:53:18.772361 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 16 12:53:18.774583 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 12:53:18.777355 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 12:53:18.777615 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 12:53:18.777725 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Dec 16 12:53:18.780891 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Dec 16 12:53:18.781770 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Dec 16 12:53:18.782538 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 12:53:18.784194 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Dec 16 12:53:18.785000 audit: BPF prog-id=58 op=LOAD
Dec 16 12:53:18.787125 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 16 12:53:18.787000 audit: BPF prog-id=59 op=LOAD
Dec 16 12:53:18.795971 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Dec 16 12:53:18.798919 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Dec 16 12:53:18.801400 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:53:18.804000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.805000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.802494 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 12:53:18.805430 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 12:53:18.805603 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 12:53:18.818000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.818000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.819000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.819000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.817586 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 16 12:53:18.817770 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 16 12:53:18.819458 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 12:53:18.819601 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 12:53:18.823234 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 16 12:53:18.827000 audit[1551]: SYSTEM_BOOT pid=1551 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.835004 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Dec 16 12:53:18.836000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.846114 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 12:53:18.846531 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 12:53:18.847000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.847000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.852461 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 16 12:53:18.871088 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Dec 16 12:53:18.873000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.886000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:53:18.886979 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Dec 16 12:53:18.892000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1
Dec 16 12:53:18.892000 audit[1580]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffd97011c00 a2=420 a3=0 items=0 ppid=1534 pid=1580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:53:18.892000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573
Dec 16 12:53:18.894296 augenrules[1580]: No rules
Dec 16 12:53:18.894476 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 16 12:53:18.894903 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 16 12:53:18.932263 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Dec 16 12:53:18.933028 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Dec 16 12:53:18.935914 systemd-networkd[1544]: lo: Link UP
Dec 16 12:53:18.935922 systemd-networkd[1544]: lo: Gained carrier
Dec 16 12:53:18.938984 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 16 12:53:18.939923 systemd[1]: Reached target network.target - Network.
Dec 16 12:53:18.941409 systemd-networkd[1544]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:53:18.941421 systemd-networkd[1544]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 16 12:53:18.942161 systemd-networkd[1544]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:53:18.942169 systemd-networkd[1544]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 16 12:53:18.942626 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Dec 16 12:53:18.947003 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Dec 16 12:53:18.949427 systemd-networkd[1544]: eth0: Link UP
Dec 16 12:53:18.949581 systemd-networkd[1544]: eth0: Gained carrier
Dec 16 12:53:18.949593 systemd-networkd[1544]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:53:18.951967 systemd-networkd[1544]: eth1: Link UP
Dec 16 12:53:18.954872 systemd-networkd[1544]: eth1: Gained carrier
Dec 16 12:53:18.954890 systemd-networkd[1544]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:53:18.961940 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:53:18.965528 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Dec 16 12:53:18.967331 systemd[1]: Reached target time-set.target - System Time Set.
Dec 16 12:53:18.982568 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Dec 16 12:53:18.996749 systemd-networkd[1544]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Dec 16 12:53:18.997783 systemd-timesyncd[1550]: Network configuration changed, trying to establish connection.
Dec 16 12:53:19.007809 systemd-networkd[1544]: eth0: DHCPv4 address 77.42.41.174/32, gateway 172.31.1.1 acquired from 172.31.1.1
Dec 16 12:53:19.216493 systemd-timesyncd[1550]: Server has too large root distance. Disconnecting.
Dec 16 12:53:19.825681 systemd-resolved[1310]: Clock change detected. Flushing caches.
Dec 16 12:53:19.826239 systemd-timesyncd[1550]: Contacted time server 141.144.230.32:123 (3.flatcar.pool.ntp.org).
Dec 16 12:53:19.826306 systemd-timesyncd[1550]: Initial clock synchronization to Tue 2025-12-16 12:53:19.825617 UTC.
Dec 16 12:53:19.956952 ldconfig[1541]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Dec 16 12:53:19.961466 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Dec 16 12:53:19.964472 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Dec 16 12:53:19.985294 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Dec 16 12:53:19.986487 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 16 12:53:19.989502 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Dec 16 12:53:19.990236 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Dec 16 12:53:19.992322 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Dec 16 12:53:19.993441 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Dec 16 12:53:19.995473 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Dec 16 12:53:19.996831 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update.
Dec 16 12:53:20.002955 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update.
Dec 16 12:53:20.004099 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Dec 16 12:53:20.004844 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Dec 16 12:53:20.004884 systemd[1]: Reached target paths.target - Path Units.
Dec 16 12:53:20.005695 systemd[1]: Reached target timers.target - Timer Units.
Dec 16 12:53:20.009818 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Dec 16 12:53:20.015620 systemd[1]: Starting docker.socket - Docker Socket for the API...
Dec 16 12:53:20.020499 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Dec 16 12:53:20.024859 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Dec 16 12:53:20.025894 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Dec 16 12:53:20.037910 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Dec 16 12:53:20.039298 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Dec 16 12:53:20.040803 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Dec 16 12:53:20.042642 systemd[1]: Reached target sockets.target - Socket Units.
Dec 16 12:53:20.043006 systemd[1]: Reached target basic.target - Basic System.
Dec 16 12:53:20.045228 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Dec 16 12:53:20.045269 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Dec 16 12:53:20.046324 systemd[1]: Starting containerd.service - containerd container runtime...
Dec 16 12:53:20.049112 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Dec 16 12:53:20.052811 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Dec 16 12:53:20.055541 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Dec 16 12:53:20.059515 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Dec 16 12:53:20.064291 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Dec 16 12:53:20.066704 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Dec 16 12:53:20.069703 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Dec 16 12:53:20.077721 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Dec 16 12:53:20.081198 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Dec 16 12:53:20.085884 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Dec 16 12:53:20.088411 jq[1605]: false
Dec 16 12:53:20.093331 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Dec 16 12:53:20.096324 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Dec 16 12:53:20.103088 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Refreshing passwd entry cache
Dec 16 12:53:20.103088 oslogin_cache_refresh[1609]: Refreshing passwd entry cache
Dec 16 12:53:20.103645 systemd[1]: Starting systemd-logind.service - User Login Management...
Dec 16 12:53:20.109938 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Failure getting users, quitting
Dec 16 12:53:20.109938 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Dec 16 12:53:20.109938 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Refreshing group entry cache
Dec 16 12:53:20.108934 oslogin_cache_refresh[1609]: Failure getting users, quitting
Dec 16 12:53:20.108948 oslogin_cache_refresh[1609]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Dec 16 12:53:20.108982 oslogin_cache_refresh[1609]: Refreshing group entry cache
Dec 16 12:53:20.110475 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Dec 16 12:53:20.111409 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Dec 16 12:53:20.112057 systemd[1]: Starting update-engine.service - Update Engine...
Dec 16 12:53:20.115235 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Dec 16 12:53:20.117553 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Dec 16 12:53:20.118783 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Dec 16 12:53:20.119002 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Dec 16 12:53:20.123011 coreos-metadata[1602]: Dec 16 12:53:20.111 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Dec 16 12:53:20.125964 coreos-metadata[1602]: Dec 16 12:53:20.125 INFO Fetch successful
Dec 16 12:53:20.125964 coreos-metadata[1602]: Dec 16 12:53:20.125 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Dec 16 12:53:20.125964 coreos-metadata[1602]: Dec 16 12:53:20.125 INFO Fetch successful
Dec 16 12:53:20.124836 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Dec 16 12:53:20.123440 oslogin_cache_refresh[1609]: Failure getting groups, quitting
Dec 16 12:53:20.126095 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Failure getting groups, quitting
Dec 16 12:53:20.126095 google_oslogin_nss_cache[1609]: oslogin_cache_refresh[1609]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Dec 16 12:53:20.123450 oslogin_cache_refresh[1609]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Dec 16 12:53:20.131972 extend-filesystems[1608]: Found /dev/sda6
Dec 16 12:53:20.134945 jq[1620]: true
Dec 16 12:53:20.138626 extend-filesystems[1608]: Found /dev/sda9
Dec 16 12:53:20.138626 extend-filesystems[1608]: Checking size of /dev/sda9
Dec 16 12:53:20.139391 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Dec 16 12:53:20.153656 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Dec 16 12:53:20.153838 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Dec 16 12:53:20.161490 extend-filesystems[1608]: Resized partition /dev/sda9
Dec 16 12:53:20.171371 extend-filesystems[1650]: resize2fs 1.47.3 (8-Jul-2025)
Dec 16 12:53:20.195411 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 8410107 blocks
Dec 16 12:53:20.195439 update_engine[1619]: I20251216 12:53:20.170432 1619 main.cc:92] Flatcar Update Engine starting
Dec 16 12:53:20.187758 systemd[1]: motdgen.service: Deactivated successfully.
Dec 16 12:53:20.188097 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Dec 16 12:53:20.209503 tar[1641]: linux-amd64/LICENSE
Dec 16 12:53:20.209981 tar[1641]: linux-amd64/helm
Dec 16 12:53:20.219924 systemd-logind[1616]: New seat seat0.
Dec 16 12:53:20.227263 systemd-logind[1616]: Watching system buttons on /dev/input/event3 (Power Button)
Dec 16 12:53:20.227278 systemd-logind[1616]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Dec 16 12:53:20.237365 jq[1646]: true
Dec 16 12:53:20.227527 systemd[1]: Started systemd-logind.service - User Login Management.
Dec 16 12:53:20.245962 dbus-daemon[1603]: [system] SELinux support is enabled
Dec 16 12:53:20.246247 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Dec 16 12:53:20.255075 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Dec 16 12:53:20.255312 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Dec 16 12:53:20.257734 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Dec 16 12:53:20.257751 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Dec 16 12:53:20.268258 dbus-daemon[1603]: [system] Successfully activated service 'org.freedesktop.systemd1'
Dec 16 12:53:20.270888 systemd[1]: Started update-engine.service - Update Engine.
Dec 16 12:53:20.273818 update_engine[1619]: I20251216 12:53:20.273419 1619 update_check_scheduler.cc:74] Next update check in 4m5s
Dec 16 12:53:20.281125 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Dec 16 12:53:20.292976 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Dec 16 12:53:20.294809 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Dec 16 12:53:20.350203 kernel: EXT4-fs (sda9): resized filesystem to 8410107
Dec 16 12:53:20.360776 extend-filesystems[1650]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Dec 16 12:53:20.360776 extend-filesystems[1650]: old_desc_blocks = 1, new_desc_blocks = 5
Dec 16 12:53:20.360776 extend-filesystems[1650]: The filesystem on /dev/sda9 is now 8410107 (4k) blocks long.
Dec 16 12:53:20.369113 extend-filesystems[1608]: Resized filesystem in /dev/sda9
Dec 16 12:53:20.362692 systemd[1]: extend-filesystems.service: Deactivated successfully.
Dec 16 12:53:20.376460 bash[1686]: Updated "/home/core/.ssh/authorized_keys"
Dec 16 12:53:20.362957 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Dec 16 12:53:20.374083 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Dec 16 12:53:20.379742 systemd[1]: Starting sshkeys.service...
Dec 16 12:53:20.437754 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Dec 16 12:53:20.441514 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Dec 16 12:53:20.446304 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Dec 16 12:53:20.532276 coreos-metadata[1698]: Dec 16 12:53:20.532 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Dec 16 12:53:20.534484 coreos-metadata[1698]: Dec 16 12:53:20.534 INFO Fetch successful
Dec 16 12:53:20.538485 unknown[1698]: wrote ssh authorized keys file for user: core
Dec 16 12:53:20.543524 containerd[1649]: time="2025-12-16T12:53:20Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Dec 16 12:53:20.547904 containerd[1649]: time="2025-12-16T12:53:20.547880560Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5
Dec 16 12:53:20.567262 sshd_keygen[1663]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Dec 16 12:53:20.571693 update-ssh-keys[1703]: Updated "/home/core/.ssh/authorized_keys"
Dec 16 12:53:20.571560 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Dec 16 12:53:20.579318 systemd[1]: Finished sshkeys.service.
Dec 16 12:53:20.585944 containerd[1649]: time="2025-12-16T12:53:20.585713556Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.068µs"
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586344410Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586393983Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586406336Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586532653Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586547070Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586600740Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586609877Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586805575Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586817117Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586826724Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586833317Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587449 containerd[1649]: time="2025-12-16T12:53:20.586951819Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587665 containerd[1649]: time="2025-12-16T12:53:20.586963090Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587665 containerd[1649]: time="2025-12-16T12:53:20.587025107Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587665 containerd[1649]: time="2025-12-16T12:53:20.587205825Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587665 containerd[1649]: time="2025-12-16T12:53:20.587230912Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 16 12:53:20.587665 containerd[1649]: time="2025-12-16T12:53:20.587239097Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Dec 16 12:53:20.589018 containerd[1649]: time="2025-12-16T12:53:20.589001242Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Dec 16 12:53:20.589761 containerd[1649]: time="2025-12-16T12:53:20.589745138Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Dec 16 12:53:20.590262 containerd[1649]: time="2025-12-16T12:53:20.590226531Z" level=info msg="metadata content store policy set" policy=shared
Dec 16 12:53:20.593928 locksmithd[1671]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Dec 16 12:53:20.593992 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Dec 16 12:53:20.595911 containerd[1649]: time="2025-12-16T12:53:20.595868923Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Dec 16 12:53:20.595967 containerd[1649]: time="2025-12-16T12:53:20.595930148Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1
Dec 16 12:53:20.596042 containerd[1649]: time="2025-12-16T12:53:20.596001762Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1
Dec 16 12:53:20.596042 containerd[1649]: time="2025-12-16T12:53:20.596032179Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Dec 16 12:53:20.596102 containerd[1649]: time="2025-12-16T12:53:20.596048660Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Dec 16 12:53:20.596102 containerd[1649]: time="2025-12-16T12:53:20.596059861Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Dec 16 12:53:20.596102 containerd[1649]: time="2025-12-16T12:53:20.596068949Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Dec 16 12:53:20.596102 containerd[1649]: time="2025-12-16T12:53:20.596075731Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Dec 16 12:53:20.596102 containerd[1649]: time="2025-12-16T12:53:20.596084207Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Dec 16 12:53:20.596102 containerd[1649]: time="2025-12-16T12:53:20.596097422Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Dec 16 12:53:20.596240 containerd[1649]: time="2025-12-16T12:53:20.596106669Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Dec 16 12:53:20.596240 containerd[1649]: time="2025-12-16T12:53:20.596115225Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Dec 16 12:53:20.596240 containerd[1649]: time="2025-12-16T12:53:20.596122228Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Dec 16 12:53:20.596240 containerd[1649]: time="2025-12-16T12:53:20.596131886Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Dec 16 12:53:20.600385 systemd[1]: Starting issuegen.service - Generate /run/issue...
Dec 16 12:53:20.603551 systemd[1]: Started sshd@0-77.42.41.174:22-147.75.109.163:41666.service - OpenSSH per-connection server daemon (147.75.109.163:41666).
Dec 16 12:53:20.606368 containerd[1649]: time="2025-12-16T12:53:20.606329913Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Dec 16 12:53:20.606963 containerd[1649]: time="2025-12-16T12:53:20.606941750Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Dec 16 12:53:20.607121 containerd[1649]: time="2025-12-16T12:53:20.607104887Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Dec 16 12:53:20.607239 containerd[1649]: time="2025-12-16T12:53:20.607223408Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Dec 16 12:53:20.607299 containerd[1649]: time="2025-12-16T12:53:20.607286296Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Dec 16 12:53:20.607859 containerd[1649]: time="2025-12-16T12:53:20.607843782Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Dec 16 12:53:20.607924 containerd[1649]: time="2025-12-16T12:53:20.607911449Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Dec 16 12:53:20.608003 containerd[1649]: time="2025-12-16T12:53:20.607990788Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Dec 16 12:53:20.608050 containerd[1649]: time="2025-12-16T12:53:20.608040571Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Dec 16 12:53:20.608094 containerd[1649]: time="2025-12-16T12:53:20.608084313Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Dec 16 12:53:20.608134 containerd[1649]: time="2025-12-16T12:53:20.608125991Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Dec 16 12:53:20.608265 containerd[1649]: time="2025-12-16T12:53:20.608250144Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Dec 16 12:53:20.608534 containerd[1649]: time="2025-12-16T12:53:20.608520662Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Dec 16 12:53:20.608583 containerd[1649]: time="2025-12-16T12:53:20.608573581Z" level=info msg="Start snapshots syncer"
Dec 16 12:53:20.608960 containerd[1649]: time="2025-12-16T12:53:20.608944076Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Dec 16 12:53:20.609436 containerd[1649]: time="2025-12-16T12:53:20.609406283Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Dec 16 12:53:20.609891 containerd[1649]: time="2025-12-16T12:53:20.609875733Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Dec 16 12:53:20.609990 containerd[1649]: time="2025-12-16T12:53:20.609974318Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Dec 16 12:53:20.610354 containerd[1649]: time="2025-12-16T12:53:20.610337399Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Dec 16 12:53:20.610821 containerd[1649]: time="2025-12-16T12:53:20.610772285Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Dec 16 12:53:20.610905 containerd[1649]: time="2025-12-16T12:53:20.610889685Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Dec 16 12:53:20.611179 containerd[1649]: time="2025-12-16T12:53:20.611152227Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611509908Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611528492Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611539413Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611555263Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611565733Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611594437Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611610457Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611617871Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611625144Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611632488Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611643158Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611651153Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611664358Z" level=info msg="runtime interface created"
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611668385Z" level=info msg="created NRI interface"
Dec 16 12:53:20.612173 containerd[1649]: time="2025-12-16T12:53:20.611677492Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Dec 16 12:53:20.612399 containerd[1649]: time="2025-12-16T12:53:20.611687000Z" level=info msg="Connect containerd service"
Dec 16 12:53:20.612399 containerd[1649]: time="2025-12-16T12:53:20.611703752Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Dec 16 12:53:20.615690 containerd[1649]: time="2025-12-16T12:53:20.614870150Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Dec 16 12:53:20.633341 systemd[1]: issuegen.service: Deactivated successfully.
Dec 16 12:53:20.633563 systemd[1]: Finished issuegen.service - Generate /run/issue.
Dec 16 12:53:20.658351 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Dec 16 12:53:20.675650 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Dec 16 12:53:20.681493 systemd[1]: Started getty@tty1.service - Getty on tty1.
Dec 16 12:53:20.686337 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Dec 16 12:53:20.690218 systemd[1]: Reached target getty.target - Login Prompts.
Dec 16 12:53:20.742313 containerd[1649]: time="2025-12-16T12:53:20.742279027Z" level=info msg="Start subscribing containerd event"
Dec 16 12:53:20.742473 containerd[1649]: time="2025-12-16T12:53:20.742448966Z" level=info msg="Start recovering state"
Dec 16 12:53:20.742627 containerd[1649]: time="2025-12-16T12:53:20.742616630Z" level=info msg="Start event monitor"
Dec 16 12:53:20.742686 containerd[1649]: time="2025-12-16T12:53:20.742667496Z" level=info msg="Start cni network conf syncer for default"
Dec 16 12:53:20.742744 containerd[1649]: time="2025-12-16T12:53:20.742734992Z" level=info msg="Start streaming server"
Dec 16 12:53:20.742825 containerd[1649]: time="2025-12-16T12:53:20.742785928Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Dec 16 12:53:20.742870 containerd[1649]: time="2025-12-16T12:53:20.742861409Z" level=info msg="runtime interface starting up..."
Dec 16 12:53:20.742928 containerd[1649]: time="2025-12-16T12:53:20.742913817Z" level=info msg="starting plugins..."
Dec 16 12:53:20.743002 containerd[1649]: time="2025-12-16T12:53:20.742975433Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Dec 16 12:53:20.743161 containerd[1649]: time="2025-12-16T12:53:20.743108533Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Dec 16 12:53:20.743466 containerd[1649]: time="2025-12-16T12:53:20.743446706Z" level=info msg=serving... address=/run/containerd/containerd.sock
Dec 16 12:53:20.743531 containerd[1649]: time="2025-12-16T12:53:20.743511318Z" level=info msg="containerd successfully booted in 0.201546s"
Dec 16 12:53:20.743685 systemd[1]: Started containerd.service - containerd container runtime.
Dec 16 12:53:20.789115 tar[1641]: linux-amd64/README.md
Dec 16 12:53:20.799851 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Dec 16 12:53:21.516376 systemd-networkd[1544]: eth1: Gained IPv6LL Dec 16 12:53:21.518929 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 12:53:21.523420 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 12:53:21.526108 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:53:21.529352 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 12:53:21.552337 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 16 12:53:21.580322 systemd-networkd[1544]: eth0: Gained IPv6LL Dec 16 12:53:21.621489 sshd[1722]: Accepted publickey for core from 147.75.109.163 port 41666 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:53:21.623287 sshd-session[1722]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:53:21.629978 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 12:53:21.632902 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 12:53:21.643665 systemd-logind[1616]: New session 1 of user core. Dec 16 12:53:21.658478 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 12:53:21.661608 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 12:53:21.672452 (systemd)[1762]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 16 12:53:21.676219 systemd-logind[1616]: New session c1 of user core. Dec 16 12:53:21.799790 systemd[1762]: Queued start job for default target default.target. Dec 16 12:53:21.806396 systemd[1762]: Created slice app.slice - User Application Slice. Dec 16 12:53:21.806518 systemd[1762]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Dec 16 12:53:21.806589 systemd[1762]: Reached target paths.target - Paths. 
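The `Accepted publickey ... RSA SHA256:+mYy...` lines identify the client key by its SHA256 fingerprint, the same format `ssh-keygen -lf` prints. As a sketch of how that value is derived (base64 of the SHA-256 of the raw key blob, with padding stripped) — using a synthetic key blob, not the key from the log:

```python
import base64
import hashlib

def ssh_fingerprint(b64_blob: str) -> str:
    """SHA256 fingerprint as OpenSSH prints it: base64 of the SHA-256 digest
    of the decoded key blob, with trailing '=' padding stripped."""
    raw = base64.b64decode(b64_blob)
    digest = hashlib.sha256(raw).digest()
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

# Synthetic blob: SSH wire-format header for an ed25519 key plus 32 dummy
# key bytes -- deliberately not the RSA key seen in the log above.
blob = base64.b64encode(
    b"\x00\x00\x00\x0bssh-ed25519\x00\x00\x00\x20" + bytes(32)
).decode()
print(ssh_fingerprint(blob))
```

The blob hashed is the base64 field from an `authorized_keys` line, which is why the fingerprint is stable across hosts for the same key.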
Dec 16 12:53:21.806627 systemd[1762]: Reached target timers.target - Timers. Dec 16 12:53:21.809263 systemd[1762]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 12:53:21.811294 systemd[1762]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Dec 16 12:53:21.818691 systemd[1762]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 12:53:21.819444 systemd[1762]: Reached target sockets.target - Sockets. Dec 16 12:53:21.823183 systemd[1762]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Dec 16 12:53:21.823276 systemd[1762]: Reached target basic.target - Basic System. Dec 16 12:53:21.823325 systemd[1762]: Reached target default.target - Main User Target. Dec 16 12:53:21.823349 systemd[1762]: Startup finished in 140ms. Dec 16 12:53:21.823550 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 12:53:21.828606 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 16 12:53:22.320009 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:53:22.321042 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 12:53:22.323933 (kubelet)[1779]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:53:22.324050 systemd[1]: Startup finished in 3.415s (kernel) + 5.912s (initrd) + 5.108s (userspace) = 14.435s. Dec 16 12:53:22.396416 systemd[1]: Started sshd@1-77.42.41.174:22-147.75.109.163:41668.service - OpenSSH per-connection server daemon (147.75.109.163:41668). 
Dec 16 12:53:22.848116 kubelet[1779]: E1216 12:53:22.848059 1779 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:53:22.850481 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:53:22.850613 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:53:22.850938 systemd[1]: kubelet.service: Consumed 845ms CPU time, 264M memory peak. Dec 16 12:53:23.356692 sshd[1781]: Accepted publickey for core from 147.75.109.163 port 41668 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:53:23.357874 sshd-session[1781]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:53:23.363272 systemd-logind[1616]: New session 2 of user core. Dec 16 12:53:23.372311 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 16 12:53:23.913290 sshd[1795]: Connection closed by 147.75.109.163 port 41668 Dec 16 12:53:23.913816 sshd-session[1781]: pam_unix(sshd:session): session closed for user core Dec 16 12:53:23.916785 systemd[1]: sshd@1-77.42.41.174:22-147.75.109.163:41668.service: Deactivated successfully. Dec 16 12:53:23.918320 systemd[1]: session-2.scope: Deactivated successfully. Dec 16 12:53:23.919989 systemd-logind[1616]: Session 2 logged out. Waiting for processes to exit. Dec 16 12:53:23.920967 systemd-logind[1616]: Removed session 2. Dec 16 12:53:24.080524 systemd[1]: Started sshd@2-77.42.41.174:22-147.75.109.163:60048.service - OpenSSH per-connection server daemon (147.75.109.163:60048). 
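The kubelet exit above is the expected pre-bootstrap state: /var/lib/kubelet/config.yaml is normally written by `kubeadm init` or `kubeadm join`, so until the node joins a cluster, systemd keeps restarting a kubelet that fails on the missing file. As a hedged sketch, this is the general shape of the KubeletConfiguration kubeadm drops there — written to a scratch path, with illustrative field values:

```python
# Illustrative only: the rough shape of the KubeletConfiguration that
# `kubeadm init`/`kubeadm join` writes to /var/lib/kubelet/config.yaml.
# Field values here are common defaults used as placeholders.
CONFIG = """\
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
staticPodPath: /etc/kubernetes/manifests
clusterDNS:
  - 10.96.0.10
clusterDomain: cluster.local
"""

path = "kubelet-demo-config.yaml"  # stand-in for /var/lib/kubelet/config.yaml
with open(path, "w") as f:
    f.write(CONFIG)
print(path)
```

On the next service restart after kubeadm creates the real file, the error above disappears.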
Dec 16 12:53:24.972593 sshd[1801]: Accepted publickey for core from 147.75.109.163 port 60048 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:53:24.973817 sshd-session[1801]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:53:24.978232 systemd-logind[1616]: New session 3 of user core. Dec 16 12:53:24.984337 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 12:53:25.481202 sshd[1804]: Connection closed by 147.75.109.163 port 60048 Dec 16 12:53:25.481709 sshd-session[1801]: pam_unix(sshd:session): session closed for user core Dec 16 12:53:25.485371 systemd[1]: sshd@2-77.42.41.174:22-147.75.109.163:60048.service: Deactivated successfully. Dec 16 12:53:25.486883 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 12:53:25.488100 systemd-logind[1616]: Session 3 logged out. Waiting for processes to exit. Dec 16 12:53:25.489632 systemd-logind[1616]: Removed session 3. Dec 16 12:53:25.696418 systemd[1]: Started sshd@3-77.42.41.174:22-147.75.109.163:60052.service - OpenSSH per-connection server daemon (147.75.109.163:60052). Dec 16 12:53:26.677232 sshd[1810]: Accepted publickey for core from 147.75.109.163 port 60052 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:53:26.678751 sshd-session[1810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:53:26.683231 systemd-logind[1616]: New session 4 of user core. Dec 16 12:53:26.692339 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 16 12:53:27.246534 sshd[1813]: Connection closed by 147.75.109.163 port 60052 Dec 16 12:53:27.247035 sshd-session[1810]: pam_unix(sshd:session): session closed for user core Dec 16 12:53:27.249884 systemd[1]: sshd@3-77.42.41.174:22-147.75.109.163:60052.service: Deactivated successfully. Dec 16 12:53:27.251557 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 12:53:27.253669 systemd-logind[1616]: Session 4 logged out. 
Waiting for processes to exit. Dec 16 12:53:27.254900 systemd-logind[1616]: Removed session 4. Dec 16 12:53:27.407411 systemd[1]: Started sshd@4-77.42.41.174:22-147.75.109.163:60058.service - OpenSSH per-connection server daemon (147.75.109.163:60058). Dec 16 12:53:28.299406 sshd[1819]: Accepted publickey for core from 147.75.109.163 port 60058 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:53:28.300700 sshd-session[1819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:53:28.306646 systemd-logind[1616]: New session 5 of user core. Dec 16 12:53:28.311371 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 16 12:53:28.654929 sudo[1823]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 12:53:28.655349 sudo[1823]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:53:28.670661 sudo[1823]: pam_unix(sudo:session): session closed for user root Dec 16 12:53:28.840350 sshd[1822]: Connection closed by 147.75.109.163 port 60058 Dec 16 12:53:28.841066 sshd-session[1819]: pam_unix(sshd:session): session closed for user core Dec 16 12:53:28.845640 systemd[1]: sshd@4-77.42.41.174:22-147.75.109.163:60058.service: Deactivated successfully. Dec 16 12:53:28.847497 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 12:53:28.849082 systemd-logind[1616]: Session 5 logged out. Waiting for processes to exit. Dec 16 12:53:28.850685 systemd-logind[1616]: Removed session 5. Dec 16 12:53:29.030043 systemd[1]: Started sshd@5-77.42.41.174:22-147.75.109.163:60072.service - OpenSSH per-connection server daemon (147.75.109.163:60072). 
Dec 16 12:53:29.909755 sshd[1829]: Accepted publickey for core from 147.75.109.163 port 60072 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:53:29.911067 sshd-session[1829]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:53:29.916479 systemd-logind[1616]: New session 6 of user core. Dec 16 12:53:29.924402 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 16 12:53:30.251368 sudo[1834]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 12:53:30.251664 sudo[1834]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:53:30.256566 sudo[1834]: pam_unix(sudo:session): session closed for user root Dec 16 12:53:30.262628 sudo[1833]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 12:53:30.262975 sudo[1833]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:53:30.273002 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:53:30.317081 kernel: kauditd_printk_skb: 73 callbacks suppressed Dec 16 12:53:30.317209 kernel: audit: type=1305 audit(1765889610.311:229): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:53:30.311000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:53:30.315246 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:53:30.317377 augenrules[1856]: No rules Dec 16 12:53:30.315496 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Dec 16 12:53:30.317802 sudo[1833]: pam_unix(sudo:session): session closed for user root Dec 16 12:53:30.311000 audit[1856]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffe8220fcf0 a2=420 a3=0 items=0 ppid=1837 pid=1856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:30.311000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:53:30.328323 kernel: audit: type=1300 audit(1765889610.311:229): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffe8220fcf0 a2=420 a3=0 items=0 ppid=1837 pid=1856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:30.328371 kernel: audit: type=1327 audit(1765889610.311:229): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:53:30.328395 kernel: audit: type=1130 audit(1765889610.314:230): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:30.314000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:30.314000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:53:30.338358 kernel: audit: type=1131 audit(1765889610.314:231): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:30.316000 audit[1833]: USER_END pid=1833 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:53:30.344055 kernel: audit: type=1106 audit(1765889610.316:232): pid=1833 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:53:30.316000 audit[1833]: CRED_DISP pid=1833 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:53:30.349194 kernel: audit: type=1104 audit(1765889610.316:233): pid=1833 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 12:53:30.485461 sshd[1832]: Connection closed by 147.75.109.163 port 60072 Dec 16 12:53:30.485932 sshd-session[1829]: pam_unix(sshd:session): session closed for user core Dec 16 12:53:30.486000 audit[1829]: USER_END pid=1829 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:53:30.489911 systemd[1]: sshd@5-77.42.41.174:22-147.75.109.163:60072.service: Deactivated successfully. Dec 16 12:53:30.491606 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 12:53:30.494250 kernel: audit: type=1106 audit(1765889610.486:234): pid=1829 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:53:30.494303 kernel: audit: type=1104 audit(1765889610.486:235): pid=1829 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:53:30.486000 audit[1829]: CRED_DISP pid=1829 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:53:30.493810 systemd-logind[1616]: Session 6 logged out. Waiting for processes to exit. Dec 16 12:53:30.494956 systemd-logind[1616]: Removed session 6. 
Dec 16 12:53:30.486000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-77.42.41.174:22-147.75.109.163:60072 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:30.500585 kernel: audit: type=1131 audit(1765889610.486:236): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-77.42.41.174:22-147.75.109.163:60072 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:30.665452 systemd[1]: Started sshd@6-77.42.41.174:22-147.75.109.163:60086.service - OpenSSH per-connection server daemon (147.75.109.163:60086). Dec 16 12:53:30.665000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-77.42.41.174:22-147.75.109.163:60086 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:31.551000 audit[1865]: USER_ACCT pid=1865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:53:31.552596 sshd[1865]: Accepted publickey for core from 147.75.109.163 port 60086 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:53:31.552000 audit[1865]: CRED_ACQ pid=1865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:53:31.552000 audit[1865]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd71762e70 a2=3 a3=0 items=0 ppid=1 pid=1865 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=7 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:31.552000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:53:31.553874 sshd-session[1865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:53:31.559463 systemd-logind[1616]: New session 7 of user core. Dec 16 12:53:31.573381 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 16 12:53:31.574000 audit[1865]: USER_START pid=1865 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:53:31.576000 audit[1868]: CRED_ACQ pid=1868 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:53:31.897000 audit[1869]: USER_ACCT pid=1869 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:53:31.898448 sudo[1869]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 12:53:31.897000 audit[1869]: CRED_REFR pid=1869 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 12:53:31.898702 sudo[1869]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:53:31.899000 audit[1869]: USER_START pid=1869 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:53:32.250980 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 16 12:53:32.267464 (dockerd)[1886]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 12:53:32.511487 dockerd[1886]: time="2025-12-16T12:53:32.511251183Z" level=info msg="Starting up" Dec 16 12:53:32.514097 dockerd[1886]: time="2025-12-16T12:53:32.513782181Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 12:53:32.522618 dockerd[1886]: time="2025-12-16T12:53:32.522591091Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 12:53:32.535148 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2389387878-merged.mount: Deactivated successfully. Dec 16 12:53:32.561793 dockerd[1886]: time="2025-12-16T12:53:32.561761817Z" level=info msg="Loading containers: start." 
Dec 16 12:53:32.572197 kernel: Initializing XFRM netlink socket Dec 16 12:53:32.618000 audit[1933]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1933 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.618000 audit[1933]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffda9f359e0 a2=0 a3=0 items=0 ppid=1886 pid=1933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.618000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:53:32.620000 audit[1935]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1935 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.620000 audit[1935]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffc928f7490 a2=0 a3=0 items=0 ppid=1886 pid=1935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.620000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:53:32.621000 audit[1937]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1937 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.621000 audit[1937]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe13d9c320 a2=0 a3=0 items=0 ppid=1886 pid=1937 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.621000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:53:32.623000 audit[1939]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1939 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.623000 audit[1939]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff3b4c8de0 a2=0 a3=0 items=0 ppid=1886 pid=1939 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.623000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:53:32.624000 audit[1941]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=1941 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.624000 audit[1941]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcbe09e970 a2=0 a3=0 items=0 ppid=1886 pid=1941 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.624000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:53:32.626000 audit[1943]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=1943 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.626000 audit[1943]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fffc4da8530 a2=0 a3=0 items=0 ppid=1886 pid=1943 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.626000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:53:32.627000 audit[1945]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1945 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.627000 audit[1945]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffec0b53870 a2=0 a3=0 items=0 ppid=1886 pid=1945 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.627000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:53:32.629000 audit[1947]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=1947 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.629000 audit[1947]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffdb9dcd7c0 a2=0 a3=0 items=0 ppid=1886 pid=1947 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.629000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:53:32.654000 audit[1950]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=1950 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.654000 audit[1950]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7fff778a8c20 a2=0 a3=0 items=0 ppid=1886 pid=1950 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.654000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Dec 16 12:53:32.656000 audit[1952]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=1952 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.656000 audit[1952]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffc28590bc0 a2=0 a3=0 items=0 ppid=1886 pid=1952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.656000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:53:32.658000 audit[1954]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1954 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.658000 audit[1954]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffd87872890 a2=0 a3=0 items=0 ppid=1886 pid=1954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.658000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:53:32.660000 audit[1956]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=1956 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.660000 audit[1956]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7fff9db57b20 a2=0 a3=0 items=0 ppid=1886 pid=1956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.660000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:53:32.663000 audit[1958]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=1958 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.663000 audit[1958]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffe07e478d0 a2=0 a3=0 items=0 ppid=1886 pid=1958 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.663000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:53:32.694000 audit[1988]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=1988 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.694000 audit[1988]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffee19da6a0 a2=0 a3=0 items=0 ppid=1886 pid=1988 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.694000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:53:32.695000 audit[1990]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=1990 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.695000 audit[1990]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffd506d59d0 a2=0 a3=0 items=0 ppid=1886 pid=1990 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.695000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:53:32.697000 audit[1992]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=1992 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.697000 audit[1992]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe72642930 a2=0 a3=0 items=0 ppid=1886 pid=1992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.697000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:53:32.699000 audit[1994]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=1994 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.699000 audit[1994]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcaf5ad410 a2=0 a3=0 items=0 ppid=1886 pid=1994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.699000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:53:32.700000 audit[1996]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=1996 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.700000 audit[1996]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcfc650380 a2=0 a3=0 items=0 ppid=1886 pid=1996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.700000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:53:32.702000 audit[1998]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=1998 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.702000 audit[1998]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffe95ced870 a2=0 a3=0 items=0 ppid=1886 pid=1998 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.702000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:53:32.703000 audit[2000]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2000 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.703000 audit[2000]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffe1371a3d0 a2=0 a3=0 items=0 ppid=1886 pid=2000 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.703000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:53:32.705000 audit[2002]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2002 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.705000 audit[2002]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffd766fbc30 a2=0 a3=0 items=0 ppid=1886 pid=2002 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.705000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:53:32.707000 audit[2004]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2004 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.707000 audit[2004]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7fffbaf72540 a2=0 a3=0 items=0 ppid=1886 pid=2004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.707000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Dec 16 12:53:32.708000 audit[2006]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2006 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.708000 audit[2006]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffcc78ede80 a2=0 a3=0 items=0 ppid=1886 pid=2006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.708000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:53:32.710000 audit[2008]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2008 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Dec 16 12:53:32.710000 audit[2008]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffde91baea0 a2=0 a3=0 items=0 ppid=1886 pid=2008 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.710000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:53:32.712000 audit[2010]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2010 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.712000 audit[2010]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7fff49e4aee0 a2=0 a3=0 items=0 ppid=1886 pid=2010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.712000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:53:32.713000 audit[2012]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2012 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.713000 audit[2012]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffccfc811c0 a2=0 a3=0 items=0 ppid=1886 pid=2012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.713000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:53:32.717000 audit[2017]: NETFILTER_CFG table=filter:28 family=2 entries=1 
op=nft_register_chain pid=2017 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.717000 audit[2017]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc8f3c9160 a2=0 a3=0 items=0 ppid=1886 pid=2017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.717000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:53:32.719000 audit[2019]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2019 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.719000 audit[2019]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffd0f9e6120 a2=0 a3=0 items=0 ppid=1886 pid=2019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.719000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:53:32.720000 audit[2021]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2021 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.720000 audit[2021]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffeb3358900 a2=0 a3=0 items=0 ppid=1886 pid=2021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.720000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:53:32.722000 audit[2023]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain 
pid=2023 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.722000 audit[2023]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcad55f640 a2=0 a3=0 items=0 ppid=1886 pid=2023 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.722000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:53:32.724000 audit[2025]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2025 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.724000 audit[2025]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7fffd9225b30 a2=0 a3=0 items=0 ppid=1886 pid=2025 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.724000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:53:32.725000 audit[2027]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2027 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:32.725000 audit[2027]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffce9b6a890 a2=0 a3=0 items=0 ppid=1886 pid=2027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.725000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:53:32.744000 audit[2031]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2031 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.744000 audit[2031]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffefcabd7e0 a2=0 a3=0 items=0 ppid=1886 pid=2031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.744000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Dec 16 12:53:32.752000 audit[2035]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2035 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.752000 audit[2035]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffdf5da75a0 a2=0 a3=0 items=0 ppid=1886 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.752000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Dec 16 12:53:32.759000 audit[2043]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2043 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.759000 audit[2043]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7fffbf27c7d0 a2=0 a3=0 items=0 ppid=1886 pid=2043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.759000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Dec 16 12:53:32.766000 audit[2049]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2049 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.766000 audit[2049]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffc69dd16b0 a2=0 a3=0 items=0 ppid=1886 pid=2049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.766000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Dec 16 12:53:32.768000 audit[2051]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2051 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.768000 audit[2051]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7fffd9a18140 a2=0 a3=0 items=0 ppid=1886 pid=2051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.768000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Dec 16 12:53:32.770000 audit[2053]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.770000 audit[2053]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffca85ad230 a2=0 a3=0 items=0 ppid=1886 pid=2053 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.770000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Dec 16 12:53:32.772000 audit[2055]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2055 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.772000 audit[2055]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffd15c80360 a2=0 a3=0 items=0 ppid=1886 pid=2055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.772000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:53:32.773000 audit[2057]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2057 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:32.773000 audit[2057]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffd7738c540 a2=0 a3=0 items=0 ppid=1886 pid=2057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:32.773000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Dec 16 12:53:32.775611 systemd-networkd[1544]: docker0: Link UP Dec 16 12:53:32.780174 dockerd[1886]: time="2025-12-16T12:53:32.780115191Z" 
level=info msg="Loading containers: done." Dec 16 12:53:32.791954 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1520962241-merged.mount: Deactivated successfully. Dec 16 12:53:32.799457 dockerd[1886]: time="2025-12-16T12:53:32.799390732Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 12:53:32.799578 dockerd[1886]: time="2025-12-16T12:53:32.799478978Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 12:53:32.799578 dockerd[1886]: time="2025-12-16T12:53:32.799551814Z" level=info msg="Initializing buildkit" Dec 16 12:53:32.820317 dockerd[1886]: time="2025-12-16T12:53:32.820283897Z" level=info msg="Completed buildkit initialization" Dec 16 12:53:32.829028 dockerd[1886]: time="2025-12-16T12:53:32.828994103Z" level=info msg="Daemon has completed initialization" Dec 16 12:53:32.829119 dockerd[1886]: time="2025-12-16T12:53:32.829039379Z" level=info msg="API listen on /run/docker.sock" Dec 16 12:53:32.829388 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 16 12:53:32.828000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:33.101449 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 12:53:33.104627 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:53:33.227054 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:53:33.226000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:53:33.230039 (kubelet)[2104]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:53:33.275805 kubelet[2104]: E1216 12:53:33.275756 2104 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:53:33.280472 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:53:33.280587 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:53:33.280000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:53:33.281834 systemd[1]: kubelet.service: Consumed 132ms CPU time, 111.3M memory peak. Dec 16 12:53:33.976723 containerd[1649]: time="2025-12-16T12:53:33.976659161Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\"" Dec 16 12:53:34.500900 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4150626027.mount: Deactivated successfully. 
Dec 16 12:53:35.237467 containerd[1649]: time="2025-12-16T12:53:35.237413429Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:35.238668 containerd[1649]: time="2025-12-16T12:53:35.238529101Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.10: active requests=0, bytes read=27403437" Dec 16 12:53:35.239466 containerd[1649]: time="2025-12-16T12:53:35.239440671Z" level=info msg="ImageCreate event name:\"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:35.241765 containerd[1649]: time="2025-12-16T12:53:35.241745393Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:35.242346 containerd[1649]: time="2025-12-16T12:53:35.242321243Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.10\" with image id \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\", size \"29068782\" in 1.265621385s" Dec 16 12:53:35.242392 containerd[1649]: time="2025-12-16T12:53:35.242353183Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\" returns image reference \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\"" Dec 16 12:53:35.243035 containerd[1649]: time="2025-12-16T12:53:35.243010806Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\"" Dec 16 12:53:36.321773 containerd[1649]: time="2025-12-16T12:53:36.321703816Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.10\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:36.322777 containerd[1649]: time="2025-12-16T12:53:36.322647396Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.10: active requests=0, bytes read=24983855" Dec 16 12:53:36.323564 containerd[1649]: time="2025-12-16T12:53:36.323539228Z" level=info msg="ImageCreate event name:\"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:36.328296 containerd[1649]: time="2025-12-16T12:53:36.328271403Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:36.328474 containerd[1649]: time="2025-12-16T12:53:36.328454117Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.10\" with image id \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\", size \"26649046\" in 1.085416269s" Dec 16 12:53:36.328535 containerd[1649]: time="2025-12-16T12:53:36.328522755Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\" returns image reference \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\"" Dec 16 12:53:36.329125 containerd[1649]: time="2025-12-16T12:53:36.329098595Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\"" Dec 16 12:53:37.269409 containerd[1649]: time="2025-12-16T12:53:37.269359454Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:37.270177 containerd[1649]: 
time="2025-12-16T12:53:37.270138004Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.10: active requests=0, bytes read=19396111" Dec 16 12:53:37.271305 containerd[1649]: time="2025-12-16T12:53:37.271014699Z" level=info msg="ImageCreate event name:\"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:37.272943 containerd[1649]: time="2025-12-16T12:53:37.272918759Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:37.273685 containerd[1649]: time="2025-12-16T12:53:37.273665159Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.10\" with image id \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\", size \"21061302\" in 944.541157ms" Dec 16 12:53:37.273769 containerd[1649]: time="2025-12-16T12:53:37.273756651Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\" returns image reference \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\"" Dec 16 12:53:37.274240 containerd[1649]: time="2025-12-16T12:53:37.274115935Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\"" Dec 16 12:53:38.265531 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3633912754.mount: Deactivated successfully. 
Dec 16 12:53:38.547357 containerd[1649]: time="2025-12-16T12:53:38.547235934Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:38.548424 containerd[1649]: time="2025-12-16T12:53:38.548302274Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.10: active requests=0, bytes read=31157702" Dec 16 12:53:38.549125 containerd[1649]: time="2025-12-16T12:53:38.549097666Z" level=info msg="ImageCreate event name:\"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:38.550708 containerd[1649]: time="2025-12-16T12:53:38.550678470Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:38.551124 containerd[1649]: time="2025-12-16T12:53:38.551095944Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.10\" with image id \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\", repo tag \"registry.k8s.io/kube-proxy:v1.32.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\", size \"31160442\" in 1.2767887s" Dec 16 12:53:38.551575 containerd[1649]: time="2025-12-16T12:53:38.551206571Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\" returns image reference \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\"" Dec 16 12:53:38.551674 containerd[1649]: time="2025-12-16T12:53:38.551637970Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Dec 16 12:53:39.036973 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2743039543.mount: Deactivated successfully. 
Dec 16 12:53:39.625976 containerd[1649]: time="2025-12-16T12:53:39.625907032Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:39.626972 containerd[1649]: time="2025-12-16T12:53:39.626838189Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=17569900" Dec 16 12:53:39.627636 containerd[1649]: time="2025-12-16T12:53:39.627613283Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:39.629506 containerd[1649]: time="2025-12-16T12:53:39.629484973Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:39.630233 containerd[1649]: time="2025-12-16T12:53:39.630203070Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.078529623s" Dec 16 12:53:39.630293 containerd[1649]: time="2025-12-16T12:53:39.630282599Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Dec 16 12:53:39.630935 containerd[1649]: time="2025-12-16T12:53:39.630903994Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Dec 16 12:53:40.785902 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1819003354.mount: Deactivated successfully. 
Dec 16 12:53:40.792800 containerd[1649]: time="2025-12-16T12:53:40.792745841Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:53:40.794247 containerd[1649]: time="2025-12-16T12:53:40.793853749Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:53:40.795345 containerd[1649]: time="2025-12-16T12:53:40.795277489Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:53:40.797743 containerd[1649]: time="2025-12-16T12:53:40.797678753Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:53:40.799203 containerd[1649]: time="2025-12-16T12:53:40.798883172Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.167949742s" Dec 16 12:53:40.799203 containerd[1649]: time="2025-12-16T12:53:40.798925211Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Dec 16 12:53:40.799913 containerd[1649]: time="2025-12-16T12:53:40.799859583Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Dec 16 12:53:41.254263 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1218191974.mount: Deactivated 
successfully. Dec 16 12:53:43.531015 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 12:53:43.533294 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:53:43.646492 kernel: kauditd_printk_skb: 134 callbacks suppressed Dec 16 12:53:43.646557 kernel: audit: type=1130 audit(1765889623.639:289): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:43.639000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:43.640204 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:53:43.652368 (kubelet)[2302]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:53:43.731999 kubelet[2302]: E1216 12:53:43.731749 2302 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:53:43.733000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:53:43.734508 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:53:43.734625 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:53:43.734907 systemd[1]: kubelet.service: Consumed 116ms CPU time, 109.5M memory peak. 
Dec 16 12:53:43.740185 kernel: audit: type=1131 audit(1765889623.733:290): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:53:43.908120 containerd[1649]: time="2025-12-16T12:53:43.907989332Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:43.909785 containerd[1649]: time="2025-12-16T12:53:43.909505576Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=55728979" Dec 16 12:53:43.910688 containerd[1649]: time="2025-12-16T12:53:43.910619164Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:43.914184 containerd[1649]: time="2025-12-16T12:53:43.912912134Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:53:43.914406 containerd[1649]: time="2025-12-16T12:53:43.914374537Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.114467415s" Dec 16 12:53:43.914509 containerd[1649]: time="2025-12-16T12:53:43.914487589Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Dec 16 12:53:46.310960 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 16 12:53:46.311107 systemd[1]: kubelet.service: Consumed 116ms CPU time, 109.5M memory peak. Dec 16 12:53:46.309000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:46.314966 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:53:46.309000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:46.320182 kernel: audit: type=1130 audit(1765889626.309:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:46.320231 kernel: audit: type=1131 audit(1765889626.309:292): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:46.341753 systemd[1]: Reload requested from client PID 2338 ('systemctl') (unit session-7.scope)... Dec 16 12:53:46.341870 systemd[1]: Reloading... Dec 16 12:53:46.421183 zram_generator::config[2384]: No configuration found. Dec 16 12:53:46.602621 systemd[1]: Reloading finished in 260 ms. 
Dec 16 12:53:46.636000 audit: BPF prog-id=63 op=LOAD Dec 16 12:53:46.636000 audit: BPF prog-id=55 op=UNLOAD Dec 16 12:53:46.637000 audit: BPF prog-id=64 op=LOAD Dec 16 12:53:46.638000 audit: BPF prog-id=65 op=LOAD Dec 16 12:53:46.638000 audit: BPF prog-id=56 op=UNLOAD Dec 16 12:53:46.638000 audit: BPF prog-id=57 op=UNLOAD Dec 16 12:53:46.639000 audit: BPF prog-id=66 op=LOAD Dec 16 12:53:46.639000 audit: BPF prog-id=52 op=UNLOAD Dec 16 12:53:46.639000 audit: BPF prog-id=67 op=LOAD Dec 16 12:53:46.639000 audit: BPF prog-id=68 op=LOAD Dec 16 12:53:46.639000 audit: BPF prog-id=53 op=UNLOAD Dec 16 12:53:46.639000 audit: BPF prog-id=54 op=UNLOAD Dec 16 12:53:46.639000 audit: BPF prog-id=69 op=LOAD Dec 16 12:53:46.641186 kernel: audit: type=1334 audit(1765889626.636:293): prog-id=63 op=LOAD Dec 16 12:53:46.641222 kernel: audit: type=1334 audit(1765889626.636:294): prog-id=55 op=UNLOAD Dec 16 12:53:46.641258 kernel: audit: type=1334 audit(1765889626.637:295): prog-id=64 op=LOAD Dec 16 12:53:46.641276 kernel: audit: type=1334 audit(1765889626.638:296): prog-id=65 op=LOAD Dec 16 12:53:46.641309 kernel: audit: type=1334 audit(1765889626.638:297): prog-id=56 op=UNLOAD Dec 16 12:53:46.641356 kernel: audit: type=1334 audit(1765889626.638:298): prog-id=57 op=UNLOAD Dec 16 12:53:46.639000 audit: BPF prog-id=58 op=UNLOAD Dec 16 12:53:46.643000 audit: BPF prog-id=70 op=LOAD Dec 16 12:53:46.643000 audit: BPF prog-id=60 op=UNLOAD Dec 16 12:53:46.643000 audit: BPF prog-id=71 op=LOAD Dec 16 12:53:46.643000 audit: BPF prog-id=72 op=LOAD Dec 16 12:53:46.643000 audit: BPF prog-id=61 op=UNLOAD Dec 16 12:53:46.643000 audit: BPF prog-id=62 op=UNLOAD Dec 16 12:53:46.644000 audit: BPF prog-id=73 op=LOAD Dec 16 12:53:46.644000 audit: BPF prog-id=74 op=LOAD Dec 16 12:53:46.644000 audit: BPF prog-id=46 op=UNLOAD Dec 16 12:53:46.644000 audit: BPF prog-id=47 op=UNLOAD Dec 16 12:53:46.644000 audit: BPF prog-id=75 op=LOAD Dec 16 12:53:46.644000 audit: BPF prog-id=43 op=UNLOAD Dec 16 12:53:46.644000 
audit: BPF prog-id=76 op=LOAD Dec 16 12:53:46.644000 audit: BPF prog-id=77 op=LOAD Dec 16 12:53:46.644000 audit: BPF prog-id=44 op=UNLOAD Dec 16 12:53:46.644000 audit: BPF prog-id=45 op=UNLOAD Dec 16 12:53:46.646000 audit: BPF prog-id=78 op=LOAD Dec 16 12:53:46.646000 audit: BPF prog-id=48 op=UNLOAD Dec 16 12:53:46.646000 audit: BPF prog-id=79 op=LOAD Dec 16 12:53:46.646000 audit: BPF prog-id=80 op=LOAD Dec 16 12:53:46.646000 audit: BPF prog-id=49 op=UNLOAD Dec 16 12:53:46.646000 audit: BPF prog-id=50 op=UNLOAD Dec 16 12:53:46.647000 audit: BPF prog-id=81 op=LOAD Dec 16 12:53:46.647000 audit: BPF prog-id=51 op=UNLOAD Dec 16 12:53:46.647000 audit: BPF prog-id=82 op=LOAD Dec 16 12:53:46.647000 audit: BPF prog-id=59 op=UNLOAD Dec 16 12:53:46.660139 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 16 12:53:46.660417 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 16 12:53:46.659000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:53:46.660787 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:53:46.660943 systemd[1]: kubelet.service: Consumed 82ms CPU time, 98.3M memory peak. Dec 16 12:53:46.663318 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:53:46.756000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:46.757114 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 16 12:53:46.763615 (kubelet)[2438]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:53:46.803775 kubelet[2438]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:53:46.803775 kubelet[2438]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 12:53:46.803775 kubelet[2438]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:53:46.804075 kubelet[2438]: I1216 12:53:46.803901 2438 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:53:47.007395 kubelet[2438]: I1216 12:53:47.007311 2438 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 16 12:53:47.008123 kubelet[2438]: I1216 12:53:47.008109 2438 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:53:47.008493 kubelet[2438]: I1216 12:53:47.008479 2438 server.go:954] "Client rotation is on, will bootstrap in background" Dec 16 12:53:47.041146 kubelet[2438]: E1216 12:53:47.041082 2438 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://77.42.41.174:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 77.42.41.174:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:53:47.043233 kubelet[2438]: I1216 
12:53:47.043195 2438 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:53:47.060862 kubelet[2438]: I1216 12:53:47.060823 2438 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:53:47.071148 kubelet[2438]: I1216 12:53:47.071109 2438 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Dec 16 12:53:47.074249 kubelet[2438]: I1216 12:53:47.074196 2438 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:53:47.074421 kubelet[2438]: I1216 12:53:47.074230 2438 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4515-1-0-8-2e3d7ab7bb","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"Topolo
gyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:53:47.074421 kubelet[2438]: I1216 12:53:47.074406 2438 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 12:53:47.074421 kubelet[2438]: I1216 12:53:47.074414 2438 container_manager_linux.go:304] "Creating device plugin manager" Dec 16 12:53:47.075695 kubelet[2438]: I1216 12:53:47.075655 2438 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:53:47.079390 kubelet[2438]: I1216 12:53:47.079369 2438 kubelet.go:446] "Attempting to sync node with API server" Dec 16 12:53:47.079446 kubelet[2438]: I1216 12:53:47.079395 2438 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:53:47.081695 kubelet[2438]: I1216 12:53:47.081353 2438 kubelet.go:352] "Adding apiserver pod source" Dec 16 12:53:47.081695 kubelet[2438]: I1216 12:53:47.081374 2438 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:53:47.088676 kubelet[2438]: W1216 12:53:47.088623 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://77.42.41.174:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-8-2e3d7ab7bb&limit=500&resourceVersion=0": dial tcp 77.42.41.174:6443: connect: connection refused Dec 16 12:53:47.089092 kubelet[2438]: E1216 12:53:47.089071 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://77.42.41.174:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-8-2e3d7ab7bb&limit=500&resourceVersion=0\": dial tcp 77.42.41.174:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:53:47.090791 kubelet[2438]: 
I1216 12:53:47.090767 2438 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:53:47.093582 kubelet[2438]: W1216 12:53:47.093517 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://77.42.41.174:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 77.42.41.174:6443: connect: connection refused Dec 16 12:53:47.093582 kubelet[2438]: E1216 12:53:47.093571 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://77.42.41.174:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 77.42.41.174:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:53:47.095020 kubelet[2438]: I1216 12:53:47.094914 2438 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 16 12:53:47.095640 kubelet[2438]: W1216 12:53:47.095623 2438 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Dec 16 12:53:47.096337 kubelet[2438]: I1216 12:53:47.096320 2438 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 16 12:53:47.096450 kubelet[2438]: I1216 12:53:47.096439 2438 server.go:1287] "Started kubelet" Dec 16 12:53:47.101935 kubelet[2438]: I1216 12:53:47.101268 2438 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:53:47.103702 kubelet[2438]: I1216 12:53:47.103676 2438 server.go:479] "Adding debug handlers to kubelet server" Dec 16 12:53:47.106488 kubelet[2438]: I1216 12:53:47.106371 2438 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:53:47.106817 kubelet[2438]: I1216 12:53:47.106803 2438 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:53:47.112227 kubelet[2438]: I1216 12:53:47.111647 2438 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:53:47.118118 kubelet[2438]: E1216 12:53:47.112657 2438 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://77.42.41.174:6443/api/v1/namespaces/default/events\": dial tcp 77.42.41.174:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4515-1-0-8-2e3d7ab7bb.1881b3427fa106f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4515-1-0-8-2e3d7ab7bb,UID:ci-4515-1-0-8-2e3d7ab7bb,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4515-1-0-8-2e3d7ab7bb,},FirstTimestamp:2025-12-16 12:53:47.096418032 +0000 UTC m=+0.329370779,LastTimestamp:2025-12-16 12:53:47.096418032 +0000 UTC m=+0.329370779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-8-2e3d7ab7bb,}" Dec 16 12:53:47.118289 kubelet[2438]: I1216 12:53:47.118263 2438 dynamic_serving_content.go:135] 
"Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:53:47.121675 kubelet[2438]: E1216 12:53:47.121569 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:47.121675 kubelet[2438]: I1216 12:53:47.121602 2438 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 16 12:53:47.121803 kubelet[2438]: I1216 12:53:47.121784 2438 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 16 12:53:47.121835 kubelet[2438]: I1216 12:53:47.121831 2438 reconciler.go:26] "Reconciler: start to sync state" Dec 16 12:53:47.122348 kubelet[2438]: W1216 12:53:47.122145 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://77.42.41.174:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 77.42.41.174:6443: connect: connection refused Dec 16 12:53:47.122348 kubelet[2438]: E1216 12:53:47.122310 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://77.42.41.174:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 77.42.41.174:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:53:47.123609 kubelet[2438]: E1216 12:53:47.122509 2438 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.41.174:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-8-2e3d7ab7bb?timeout=10s\": dial tcp 77.42.41.174:6443: connect: connection refused" interval="200ms" Dec 16 12:53:47.125000 audit[2450]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2450 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:47.125000 audit[2450]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 
a1=7fff02e58b90 a2=0 a3=0 items=0 ppid=2438 pid=2450 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.125000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:53:47.129000 audit[2451]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2451 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:47.129000 audit[2451]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff2a0a3730 a2=0 a3=0 items=0 ppid=2438 pid=2451 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.129000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:53:47.136000 audit[2453]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2453 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:47.136000 audit[2453]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe037034b0 a2=0 a3=0 items=0 ppid=2438 pid=2453 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.136000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:53:47.139509 kubelet[2438]: E1216 12:53:47.139414 2438 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:53:47.142043 kubelet[2438]: I1216 12:53:47.141991 2438 factory.go:221] Registration of the containerd container factory successfully Dec 16 12:53:47.142043 kubelet[2438]: I1216 12:53:47.142027 2438 factory.go:221] Registration of the systemd container factory successfully Dec 16 12:53:47.141000 audit[2455]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2455 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:47.142867 kubelet[2438]: I1216 12:53:47.142798 2438 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:53:47.141000 audit[2455]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fffbceb9270 a2=0 a3=0 items=0 ppid=2438 pid=2455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.141000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:53:47.160000 audit[2461]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2461 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:47.160000 audit[2461]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffc15c21f80 a2=0 a3=0 items=0 ppid=2438 pid=2461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.160000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Dec 16 12:53:47.162847 kubelet[2438]: I1216 12:53:47.162777 2438 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 16 12:53:47.162000 audit[2463]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2463 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:47.162000 audit[2463]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffd4ff95810 a2=0 a3=0 items=0 ppid=2438 pid=2463 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.162000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:53:47.164311 kubelet[2438]: I1216 12:53:47.164297 2438 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 16 12:53:47.164397 kubelet[2438]: I1216 12:53:47.164390 2438 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 16 12:53:47.164470 kubelet[2438]: I1216 12:53:47.164446 2438 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 16 12:53:47.164512 kubelet[2438]: I1216 12:53:47.164506 2438 kubelet.go:2382] "Starting kubelet main sync loop" Dec 16 12:53:47.164612 kubelet[2438]: E1216 12:53:47.164594 2438 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:53:47.166375 kubelet[2438]: W1216 12:53:47.166346 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://77.42.41.174:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 77.42.41.174:6443: connect: connection refused Dec 16 12:53:47.166513 kubelet[2438]: E1216 12:53:47.166466 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://77.42.41.174:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 77.42.41.174:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:53:47.166000 audit[2465]: NETFILTER_CFG table=mangle:48 family=10 entries=1 op=nft_register_chain pid=2465 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:47.166000 audit[2465]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe73f88a90 a2=0 a3=0 items=0 ppid=2438 pid=2465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.166000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:53:47.166000 audit[2464]: NETFILTER_CFG table=mangle:49 family=2 entries=1 op=nft_register_chain pid=2464 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:47.166000 audit[2464]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 
a1=7ffdc827a5f0 a2=0 a3=0 items=0 ppid=2438 pid=2464 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.166000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:53:47.170000 audit[2468]: NETFILTER_CFG table=nat:50 family=10 entries=1 op=nft_register_chain pid=2468 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:47.170000 audit[2468]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff3cc4ff20 a2=0 a3=0 items=0 ppid=2438 pid=2468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.170000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:53:47.172000 audit[2469]: NETFILTER_CFG table=filter:51 family=10 entries=1 op=nft_register_chain pid=2469 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:47.172000 audit[2467]: NETFILTER_CFG table=nat:52 family=2 entries=1 op=nft_register_chain pid=2467 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:47.172000 audit[2467]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc7d5a3350 a2=0 a3=0 items=0 ppid=2438 pid=2467 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.172000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:53:47.172000 audit[2469]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 
a0=3 a1=7fff0d71e900 a2=0 a3=0 items=0 ppid=2438 pid=2469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.172000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:53:47.174000 audit[2470]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_chain pid=2470 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:47.174000 audit[2470]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe002baa60 a2=0 a3=0 items=0 ppid=2438 pid=2470 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.174000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:53:47.175934 kubelet[2438]: I1216 12:53:47.175904 2438 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:53:47.175934 kubelet[2438]: I1216 12:53:47.175916 2438 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:53:47.175934 kubelet[2438]: I1216 12:53:47.175930 2438 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:53:47.180892 kubelet[2438]: I1216 12:53:47.180874 2438 policy_none.go:49] "None policy: Start" Dec 16 12:53:47.180892 kubelet[2438]: I1216 12:53:47.180890 2438 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 16 12:53:47.180966 kubelet[2438]: I1216 12:53:47.180899 2438 state_mem.go:35] "Initializing new in-memory state store" Dec 16 12:53:47.186057 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Dec 16 12:53:47.207134 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 16 12:53:47.210886 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 16 12:53:47.222184 kubelet[2438]: E1216 12:53:47.222137 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:47.223837 kubelet[2438]: I1216 12:53:47.223821 2438 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 16 12:53:47.224002 kubelet[2438]: I1216 12:53:47.223972 2438 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:53:47.224043 kubelet[2438]: I1216 12:53:47.223989 2438 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:53:47.224518 kubelet[2438]: I1216 12:53:47.224507 2438 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:53:47.225417 kubelet[2438]: E1216 12:53:47.225397 2438 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 12:53:47.225485 kubelet[2438]: E1216 12:53:47.225433 2438 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:47.276454 systemd[1]: Created slice kubepods-burstable-pod12e2af47f996b27ec9bf257ff79845f4.slice - libcontainer container kubepods-burstable-pod12e2af47f996b27ec9bf257ff79845f4.slice. 
Dec 16 12:53:47.278939 kubelet[2438]: W1216 12:53:47.278910 2438 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e2af47f996b27ec9bf257ff79845f4.slice/cpu.weight": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e2af47f996b27ec9bf257ff79845f4.slice/cpu.weight: no such device Dec 16 12:53:47.293287 kubelet[2438]: E1216 12:53:47.293241 2438 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.296670 systemd[1]: Created slice kubepods-burstable-pod1f2e41c58f9938b008f633e63e0502f3.slice - libcontainer container kubepods-burstable-pod1f2e41c58f9938b008f633e63e0502f3.slice. Dec 16 12:53:47.311308 kubelet[2438]: E1216 12:53:47.311290 2438 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.314487 systemd[1]: Created slice kubepods-burstable-poda45033eea666a2fd3bce119ad176bc71.slice - libcontainer container kubepods-burstable-poda45033eea666a2fd3bce119ad176bc71.slice. 
Dec 16 12:53:47.316461 kubelet[2438]: E1216 12:53:47.316437 2438 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.324700 kubelet[2438]: E1216 12:53:47.324628 2438 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.41.174:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-8-2e3d7ab7bb?timeout=10s\": dial tcp 77.42.41.174:6443: connect: connection refused" interval="400ms" Dec 16 12:53:47.325939 kubelet[2438]: I1216 12:53:47.325922 2438 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.326219 kubelet[2438]: E1216 12:53:47.326182 2438 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.41.174:6443/api/v1/nodes\": dial tcp 77.42.41.174:6443: connect: connection refused" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.423778 kubelet[2438]: I1216 12:53:47.423702 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1f2e41c58f9938b008f633e63e0502f3-k8s-certs\") pod \"kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"1f2e41c58f9938b008f633e63e0502f3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.423778 kubelet[2438]: I1216 12:53:47.423768 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1f2e41c58f9938b008f633e63e0502f3-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"1f2e41c58f9938b008f633e63e0502f3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.423778 kubelet[2438]: I1216 12:53:47.423791 2438 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/12e2af47f996b27ec9bf257ff79845f4-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"12e2af47f996b27ec9bf257ff79845f4\") " pod="kube-system/kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.423778 kubelet[2438]: I1216 12:53:47.423806 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1f2e41c58f9938b008f633e63e0502f3-ca-certs\") pod \"kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"1f2e41c58f9938b008f633e63e0502f3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.424069 kubelet[2438]: I1216 12:53:47.423821 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1f2e41c58f9938b008f633e63e0502f3-flexvolume-dir\") pod \"kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"1f2e41c58f9938b008f633e63e0502f3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.424069 kubelet[2438]: I1216 12:53:47.423836 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1f2e41c58f9938b008f633e63e0502f3-kubeconfig\") pod \"kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"1f2e41c58f9938b008f633e63e0502f3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.424069 kubelet[2438]: I1216 12:53:47.423851 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a45033eea666a2fd3bce119ad176bc71-kubeconfig\") pod \"kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb\" (UID: 
\"a45033eea666a2fd3bce119ad176bc71\") " pod="kube-system/kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.424069 kubelet[2438]: I1216 12:53:47.423868 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/12e2af47f996b27ec9bf257ff79845f4-ca-certs\") pod \"kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"12e2af47f996b27ec9bf257ff79845f4\") " pod="kube-system/kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.424069 kubelet[2438]: I1216 12:53:47.423900 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/12e2af47f996b27ec9bf257ff79845f4-k8s-certs\") pod \"kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"12e2af47f996b27ec9bf257ff79845f4\") " pod="kube-system/kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.529319 kubelet[2438]: I1216 12:53:47.529213 2438 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.529591 kubelet[2438]: E1216 12:53:47.529541 2438 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.41.174:6443/api/v1/nodes\": dial tcp 77.42.41.174:6443: connect: connection refused" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.594892 containerd[1649]: time="2025-12-16T12:53:47.594843144Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb,Uid:12e2af47f996b27ec9bf257ff79845f4,Namespace:kube-system,Attempt:0,}" Dec 16 12:53:47.612942 containerd[1649]: time="2025-12-16T12:53:47.612852059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb,Uid:1f2e41c58f9938b008f633e63e0502f3,Namespace:kube-system,Attempt:0,}" Dec 16 12:53:47.617748 containerd[1649]: time="2025-12-16T12:53:47.617676307Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb,Uid:a45033eea666a2fd3bce119ad176bc71,Namespace:kube-system,Attempt:0,}" Dec 16 12:53:47.726151 containerd[1649]: time="2025-12-16T12:53:47.726051987Z" level=info msg="connecting to shim d885db62aa1ff963adabdcb48f9b5e0fca3aacd61aacd377293e25e07f2e340c" address="unix:///run/containerd/s/55dcba99f89cf5d347774b19885886473d7f8eb45100aaa5b275880aa538adfb" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:53:47.726459 kubelet[2438]: E1216 12:53:47.726100 2438 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.41.174:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-8-2e3d7ab7bb?timeout=10s\": dial tcp 77.42.41.174:6443: connect: connection refused" interval="800ms" Dec 16 12:53:47.729548 containerd[1649]: time="2025-12-16T12:53:47.729492089Z" level=info msg="connecting to shim 94969a0ca0fa954824a585cf6ed6ae67112f1bfc39a6b743836847d7f32853d2" address="unix:///run/containerd/s/671293f849e4a705eb9c9e8665b0efe15278e9f9a456893d0ea9c7377a22d34d" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:53:47.736625 containerd[1649]: time="2025-12-16T12:53:47.736261495Z" level=info msg="connecting to shim bbf74f643a3ff0d4dc614ac308ee8832690ea0fbd93b520a5909256cc88f925b" address="unix:///run/containerd/s/b3e38070f6a3c8c234142c37076e1557bdb07a828fa397f3375263cc4ae17dc5" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:53:47.811311 systemd[1]: Started cri-containerd-d885db62aa1ff963adabdcb48f9b5e0fca3aacd61aacd377293e25e07f2e340c.scope - libcontainer container d885db62aa1ff963adabdcb48f9b5e0fca3aacd61aacd377293e25e07f2e340c. Dec 16 12:53:47.819455 systemd[1]: Started cri-containerd-94969a0ca0fa954824a585cf6ed6ae67112f1bfc39a6b743836847d7f32853d2.scope - libcontainer container 94969a0ca0fa954824a585cf6ed6ae67112f1bfc39a6b743836847d7f32853d2. 
Dec 16 12:53:47.822508 systemd[1]: Started cri-containerd-bbf74f643a3ff0d4dc614ac308ee8832690ea0fbd93b520a5909256cc88f925b.scope - libcontainer container bbf74f643a3ff0d4dc614ac308ee8832690ea0fbd93b520a5909256cc88f925b. Dec 16 12:53:47.842000 audit: BPF prog-id=83 op=LOAD Dec 16 12:53:47.843000 audit: BPF prog-id=84 op=LOAD Dec 16 12:53:47.843000 audit[2532]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2512 pid=2532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.843000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262663734663634336133666630643464633631346163333038656538 Dec 16 12:53:47.843000 audit: BPF prog-id=84 op=UNLOAD Dec 16 12:53:47.843000 audit[2532]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2512 pid=2532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.843000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262663734663634336133666630643464633631346163333038656538 Dec 16 12:53:47.844000 audit: BPF prog-id=85 op=LOAD Dec 16 12:53:47.845000 audit: BPF prog-id=86 op=LOAD Dec 16 12:53:47.845000 audit[2523]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000228238 a2=98 a3=0 items=0 ppid=2489 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438383564623632616131666639363361646162646362343866396235 Dec 16 12:53:47.845000 audit: BPF prog-id=86 op=UNLOAD Dec 16 12:53:47.845000 audit[2523]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438383564623632616131666639363361646162646362343866396235 Dec 16 12:53:47.845000 audit: BPF prog-id=87 op=LOAD Dec 16 12:53:47.845000 audit[2532]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2512 pid=2532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262663734663634336133666630643464633631346163333038656538 Dec 16 12:53:47.845000 audit: BPF prog-id=88 op=LOAD Dec 16 12:53:47.845000 audit[2532]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2512 pid=2532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262663734663634336133666630643464633631346163333038656538 Dec 16 12:53:47.845000 audit: BPF prog-id=88 op=UNLOAD Dec 16 12:53:47.845000 audit[2532]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2512 pid=2532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262663734663634336133666630643464633631346163333038656538 Dec 16 12:53:47.846000 audit: BPF prog-id=87 op=UNLOAD Dec 16 12:53:47.846000 audit[2532]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2512 pid=2532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.846000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262663734663634336133666630643464633631346163333038656538 Dec 16 12:53:47.846000 audit: BPF prog-id=89 op=LOAD Dec 16 12:53:47.846000 audit[2532]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2512 pid=2532 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.846000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262663734663634336133666630643464633631346163333038656538 Dec 16 12:53:47.846000 audit: BPF prog-id=90 op=LOAD Dec 16 12:53:47.846000 audit: BPF prog-id=91 op=LOAD Dec 16 12:53:47.846000 audit[2523]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000228488 a2=98 a3=0 items=0 ppid=2489 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.846000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438383564623632616131666639363361646162646362343866396235 Dec 16 12:53:47.846000 audit: BPF prog-id=92 op=LOAD Dec 16 12:53:47.846000 audit[2523]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000228218 a2=98 a3=0 items=0 ppid=2489 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.846000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438383564623632616131666639363361646162646362343866396235 Dec 16 12:53:47.846000 audit: BPF prog-id=92 op=UNLOAD Dec 16 12:53:47.846000 audit[2523]: SYSCALL arch=c000003e syscall=3 
success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.846000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438383564623632616131666639363361646162646362343866396235 Dec 16 12:53:47.846000 audit: BPF prog-id=91 op=UNLOAD Dec 16 12:53:47.846000 audit[2523]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.846000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438383564623632616131666639363361646162646362343866396235 Dec 16 12:53:47.847000 audit: BPF prog-id=93 op=LOAD Dec 16 12:53:47.847000 audit[2534]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2497 pid=2534 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.847000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934393639613063613066613935343832346135383563663665643661 Dec 16 12:53:47.847000 audit: BPF prog-id=93 op=UNLOAD Dec 16 12:53:47.847000 audit[2534]: 
SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2497 pid=2534 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.847000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934393639613063613066613935343832346135383563663665643661 Dec 16 12:53:47.847000 audit: BPF prog-id=94 op=LOAD Dec 16 12:53:47.846000 audit: BPF prog-id=95 op=LOAD Dec 16 12:53:47.847000 audit[2534]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2497 pid=2534 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.847000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934393639613063613066613935343832346135383563663665643661 Dec 16 12:53:47.847000 audit: BPF prog-id=96 op=LOAD Dec 16 12:53:47.846000 audit[2523]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0002286e8 a2=98 a3=0 items=0 ppid=2489 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.847000 audit[2534]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2497 pid=2534 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.847000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934393639613063613066613935343832346135383563663665643661 Dec 16 12:53:47.846000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438383564623632616131666639363361646162646362343866396235 Dec 16 12:53:47.847000 audit: BPF prog-id=96 op=UNLOAD Dec 16 12:53:47.847000 audit[2534]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2497 pid=2534 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.847000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934393639613063613066613935343832346135383563663665643661 Dec 16 12:53:47.847000 audit: BPF prog-id=94 op=UNLOAD Dec 16 12:53:47.847000 audit[2534]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2497 pid=2534 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.847000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934393639613063613066613935343832346135383563663665643661 Dec 16 12:53:47.847000 audit: BPF prog-id=97 op=LOAD Dec 16 12:53:47.847000 audit[2534]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2497 pid=2534 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:47.847000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934393639613063613066613935343832346135383563663665643661 Dec 16 12:53:47.901533 containerd[1649]: time="2025-12-16T12:53:47.901487207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb,Uid:12e2af47f996b27ec9bf257ff79845f4,Namespace:kube-system,Attempt:0,} returns sandbox id \"bbf74f643a3ff0d4dc614ac308ee8832690ea0fbd93b520a5909256cc88f925b\"" Dec 16 12:53:47.906516 containerd[1649]: time="2025-12-16T12:53:47.906479891Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb,Uid:a45033eea666a2fd3bce119ad176bc71,Namespace:kube-system,Attempt:0,} returns sandbox id \"94969a0ca0fa954824a585cf6ed6ae67112f1bfc39a6b743836847d7f32853d2\"" Dec 16 12:53:47.907404 containerd[1649]: time="2025-12-16T12:53:47.907366474Z" level=info msg="CreateContainer within sandbox \"bbf74f643a3ff0d4dc614ac308ee8832690ea0fbd93b520a5909256cc88f925b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 16 12:53:47.921900 containerd[1649]: time="2025-12-16T12:53:47.921868403Z" level=info msg="CreateContainer 
within sandbox \"94969a0ca0fa954824a585cf6ed6ae67112f1bfc39a6b743836847d7f32853d2\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 16 12:53:47.927646 containerd[1649]: time="2025-12-16T12:53:47.927591176Z" level=info msg="Container d42dc438497b3e08d78be8a4bd2cbee36675bb63623718b87ac591e9f5581928: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:53:47.935506 containerd[1649]: time="2025-12-16T12:53:47.935451467Z" level=info msg="Container ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:53:47.936363 kubelet[2438]: I1216 12:53:47.936322 2438 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.936821 kubelet[2438]: E1216 12:53:47.936654 2438 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.41.174:6443/api/v1/nodes\": dial tcp 77.42.41.174:6443: connect: connection refused" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:47.945577 containerd[1649]: time="2025-12-16T12:53:47.945535821Z" level=info msg="CreateContainer within sandbox \"bbf74f643a3ff0d4dc614ac308ee8832690ea0fbd93b520a5909256cc88f925b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d42dc438497b3e08d78be8a4bd2cbee36675bb63623718b87ac591e9f5581928\"" Dec 16 12:53:47.946041 containerd[1649]: time="2025-12-16T12:53:47.946019548Z" level=info msg="StartContainer for \"d42dc438497b3e08d78be8a4bd2cbee36675bb63623718b87ac591e9f5581928\"" Dec 16 12:53:47.947332 containerd[1649]: time="2025-12-16T12:53:47.947284841Z" level=info msg="connecting to shim d42dc438497b3e08d78be8a4bd2cbee36675bb63623718b87ac591e9f5581928" address="unix:///run/containerd/s/b3e38070f6a3c8c234142c37076e1557bdb07a828fa397f3375263cc4ae17dc5" protocol=ttrpc version=3 Dec 16 12:53:47.948931 containerd[1649]: time="2025-12-16T12:53:47.948761932Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb,Uid:1f2e41c58f9938b008f633e63e0502f3,Namespace:kube-system,Attempt:0,} returns sandbox id \"d885db62aa1ff963adabdcb48f9b5e0fca3aacd61aacd377293e25e07f2e340c\"" Dec 16 12:53:47.953663 containerd[1649]: time="2025-12-16T12:53:47.953626675Z" level=info msg="CreateContainer within sandbox \"94969a0ca0fa954824a585cf6ed6ae67112f1bfc39a6b743836847d7f32853d2\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c\"" Dec 16 12:53:47.954307 containerd[1649]: time="2025-12-16T12:53:47.954276114Z" level=info msg="CreateContainer within sandbox \"d885db62aa1ff963adabdcb48f9b5e0fca3aacd61aacd377293e25e07f2e340c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 16 12:53:47.955329 containerd[1649]: time="2025-12-16T12:53:47.955298912Z" level=info msg="StartContainer for \"ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c\"" Dec 16 12:53:47.956223 containerd[1649]: time="2025-12-16T12:53:47.956200353Z" level=info msg="connecting to shim ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c" address="unix:///run/containerd/s/671293f849e4a705eb9c9e8665b0efe15278e9f9a456893d0ea9c7377a22d34d" protocol=ttrpc version=3 Dec 16 12:53:47.968191 containerd[1649]: time="2025-12-16T12:53:47.967270895Z" level=info msg="Container 520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:53:47.976779 containerd[1649]: time="2025-12-16T12:53:47.976717192Z" level=info msg="CreateContainer within sandbox \"d885db62aa1ff963adabdcb48f9b5e0fca3aacd61aacd377293e25e07f2e340c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c\"" Dec 16 12:53:47.977211 containerd[1649]: time="2025-12-16T12:53:47.977123504Z" level=info 
msg="StartContainer for \"520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c\"" Dec 16 12:53:47.978380 systemd[1]: Started cri-containerd-d42dc438497b3e08d78be8a4bd2cbee36675bb63623718b87ac591e9f5581928.scope - libcontainer container d42dc438497b3e08d78be8a4bd2cbee36675bb63623718b87ac591e9f5581928. Dec 16 12:53:47.980696 containerd[1649]: time="2025-12-16T12:53:47.980103904Z" level=info msg="connecting to shim 520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c" address="unix:///run/containerd/s/55dcba99f89cf5d347774b19885886473d7f8eb45100aaa5b275880aa538adfb" protocol=ttrpc version=3 Dec 16 12:53:47.991313 systemd[1]: Started cri-containerd-ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c.scope - libcontainer container ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c. Dec 16 12:53:47.992950 kubelet[2438]: W1216 12:53:47.992764 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://77.42.41.174:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-8-2e3d7ab7bb&limit=500&resourceVersion=0": dial tcp 77.42.41.174:6443: connect: connection refused Dec 16 12:53:47.992950 kubelet[2438]: E1216 12:53:47.992821 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://77.42.41.174:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-8-2e3d7ab7bb&limit=500&resourceVersion=0\": dial tcp 77.42.41.174:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:53:48.006322 systemd[1]: Started cri-containerd-520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c.scope - libcontainer container 520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c. 
Dec 16 12:53:48.008000 audit: BPF prog-id=98 op=LOAD Dec 16 12:53:48.010000 audit: BPF prog-id=99 op=LOAD Dec 16 12:53:48.010000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128238 a2=98 a3=0 items=0 ppid=2512 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.010000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326463343338343937623365303864373862653861346264326362 Dec 16 12:53:48.010000 audit: BPF prog-id=99 op=UNLOAD Dec 16 12:53:48.010000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2512 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.010000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326463343338343937623365303864373862653861346264326362 Dec 16 12:53:48.010000 audit: BPF prog-id=100 op=LOAD Dec 16 12:53:48.010000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128488 a2=98 a3=0 items=0 ppid=2512 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.010000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326463343338343937623365303864373862653861346264326362 Dec 16 12:53:48.010000 audit: BPF prog-id=101 op=LOAD Dec 16 12:53:48.010000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000128218 a2=98 a3=0 items=0 ppid=2512 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.010000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326463343338343937623365303864373862653861346264326362 Dec 16 12:53:48.010000 audit: BPF prog-id=101 op=UNLOAD Dec 16 12:53:48.010000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2512 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.010000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326463343338343937623365303864373862653861346264326362 Dec 16 12:53:48.010000 audit: BPF prog-id=100 op=UNLOAD Dec 16 12:53:48.010000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2512 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:53:48.010000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326463343338343937623365303864373862653861346264326362 Dec 16 12:53:48.010000 audit: BPF prog-id=102 op=LOAD Dec 16 12:53:48.010000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001286e8 a2=98 a3=0 items=0 ppid=2512 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.010000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326463343338343937623365303864373862653861346264326362 Dec 16 12:53:48.014000 audit: BPF prog-id=103 op=LOAD Dec 16 12:53:48.014000 audit: BPF prog-id=104 op=LOAD Dec 16 12:53:48.014000 audit[2613]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=2497 pid=2613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.014000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165313330613761326161393036363839343562303061626635373737 Dec 16 12:53:48.014000 audit: BPF prog-id=104 op=UNLOAD Dec 16 12:53:48.014000 audit[2613]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2497 pid=2613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.014000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165313330613761326161393036363839343562303061626635373737 Dec 16 12:53:48.015000 audit: BPF prog-id=105 op=LOAD Dec 16 12:53:48.015000 audit[2613]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=2497 pid=2613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.015000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165313330613761326161393036363839343562303061626635373737 Dec 16 12:53:48.015000 audit: BPF prog-id=106 op=LOAD Dec 16 12:53:48.015000 audit[2613]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=2497 pid=2613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.015000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165313330613761326161393036363839343562303061626635373737 Dec 16 12:53:48.015000 audit: BPF prog-id=106 op=UNLOAD Dec 16 12:53:48.015000 audit[2613]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2497 pid=2613 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.015000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165313330613761326161393036363839343562303061626635373737 Dec 16 12:53:48.015000 audit: BPF prog-id=105 op=UNLOAD Dec 16 12:53:48.015000 audit[2613]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2497 pid=2613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.015000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165313330613761326161393036363839343562303061626635373737 Dec 16 12:53:48.015000 audit: BPF prog-id=107 op=LOAD Dec 16 12:53:48.015000 audit[2613]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=2497 pid=2613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.015000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165313330613761326161393036363839343562303061626635373737 Dec 16 12:53:48.027343 kubelet[2438]: W1216 12:53:48.027308 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://77.42.41.174:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 77.42.41.174:6443: connect: connection refused Dec 16 12:53:48.027448 kubelet[2438]: E1216 12:53:48.027354 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://77.42.41.174:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 77.42.41.174:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:53:48.028000 audit: BPF prog-id=108 op=LOAD Dec 16 12:53:48.028000 audit: BPF prog-id=109 op=LOAD Dec 16 12:53:48.028000 audit[2639]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=2489 pid=2639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.028000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532306334656531343336663738363039373465316465353734376465 Dec 16 12:53:48.029000 audit: BPF prog-id=109 op=UNLOAD Dec 16 12:53:48.029000 audit[2639]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532306334656531343336663738363039373465316465353734376465 Dec 16 12:53:48.029000 audit: BPF 
prog-id=110 op=LOAD Dec 16 12:53:48.029000 audit[2639]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2489 pid=2639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532306334656531343336663738363039373465316465353734376465 Dec 16 12:53:48.029000 audit: BPF prog-id=111 op=LOAD Dec 16 12:53:48.029000 audit[2639]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=2489 pid=2639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532306334656531343336663738363039373465316465353734376465 Dec 16 12:53:48.029000 audit: BPF prog-id=111 op=UNLOAD Dec 16 12:53:48.029000 audit[2639]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.029000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532306334656531343336663738363039373465316465353734376465 Dec 16 12:53:48.029000 audit: BPF prog-id=110 op=UNLOAD Dec 16 12:53:48.029000 audit[2639]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532306334656531343336663738363039373465316465353734376465 Dec 16 12:53:48.029000 audit: BPF prog-id=112 op=LOAD Dec 16 12:53:48.029000 audit[2639]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=2489 pid=2639 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:48.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532306334656531343336663738363039373465316465353734376465 Dec 16 12:53:48.073255 containerd[1649]: time="2025-12-16T12:53:48.072990383Z" level=info msg="StartContainer for \"d42dc438497b3e08d78be8a4bd2cbee36675bb63623718b87ac591e9f5581928\" returns successfully" Dec 16 12:53:48.087438 kubelet[2438]: W1216 12:53:48.087373 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://77.42.41.174:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 77.42.41.174:6443: connect: connection refused Dec 16 12:53:48.087666 kubelet[2438]: E1216 12:53:48.087448 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://77.42.41.174:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 77.42.41.174:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:53:48.090398 containerd[1649]: time="2025-12-16T12:53:48.090317631Z" level=info msg="StartContainer for \"ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c\" returns successfully" Dec 16 12:53:48.100124 containerd[1649]: time="2025-12-16T12:53:48.099854838Z" level=info msg="StartContainer for \"520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c\" returns successfully" Dec 16 12:53:48.184474 kubelet[2438]: E1216 12:53:48.184448 2438 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:48.185362 kubelet[2438]: E1216 12:53:48.185303 2438 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:48.192035 kubelet[2438]: E1216 12:53:48.191927 2438 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:48.281846 kubelet[2438]: W1216 12:53:48.281781 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://77.42.41.174:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 77.42.41.174:6443: 
connect: connection refused Dec 16 12:53:48.283227 kubelet[2438]: E1216 12:53:48.283206 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://77.42.41.174:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 77.42.41.174:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:53:48.741373 kubelet[2438]: I1216 12:53:48.741341 2438 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:49.190699 kubelet[2438]: E1216 12:53:49.190352 2438 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:49.191344 kubelet[2438]: E1216 12:53:49.191233 2438 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:49.489875 kubelet[2438]: E1216 12:53:49.489759 2438 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4515-1-0-8-2e3d7ab7bb\" not found" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:49.534029 kubelet[2438]: E1216 12:53:49.533930 2438 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4515-1-0-8-2e3d7ab7bb.1881b3427fa106f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4515-1-0-8-2e3d7ab7bb,UID:ci-4515-1-0-8-2e3d7ab7bb,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4515-1-0-8-2e3d7ab7bb,},FirstTimestamp:2025-12-16 12:53:47.096418032 +0000 UTC m=+0.329370779,LastTimestamp:2025-12-16 12:53:47.096418032 +0000 UTC 
m=+0.329370779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-8-2e3d7ab7bb,}" Dec 16 12:53:49.587558 kubelet[2438]: E1216 12:53:49.587334 2438 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4515-1-0-8-2e3d7ab7bb.1881b3428230e09b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4515-1-0-8-2e3d7ab7bb,UID:ci-4515-1-0-8-2e3d7ab7bb,APIVersion:,ResourceVersion:,FieldPath:,},Reason:InvalidDiskCapacity,Message:invalid capacity 0 on image filesystem,Source:EventSource{Component:kubelet,Host:ci-4515-1-0-8-2e3d7ab7bb,},FirstTimestamp:2025-12-16 12:53:47.139399835 +0000 UTC m=+0.372352582,LastTimestamp:2025-12-16 12:53:47.139399835 +0000 UTC m=+0.372352582,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-8-2e3d7ab7bb,}" Dec 16 12:53:49.594860 kubelet[2438]: I1216 12:53:49.594711 2438 kubelet_node_status.go:78] "Successfully registered node" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:49.594860 kubelet[2438]: E1216 12:53:49.594738 2438 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4515-1-0-8-2e3d7ab7bb\": node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:49.606670 kubelet[2438]: E1216 12:53:49.606642 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:49.707732 kubelet[2438]: E1216 12:53:49.707685 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:49.808818 kubelet[2438]: E1216 12:53:49.808691 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node 
\"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:49.909098 kubelet[2438]: E1216 12:53:49.909037 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:50.009802 kubelet[2438]: E1216 12:53:50.009753 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:50.110908 kubelet[2438]: E1216 12:53:50.110793 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:50.211396 kubelet[2438]: E1216 12:53:50.211344 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:50.312434 kubelet[2438]: E1216 12:53:50.312390 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:50.413233 kubelet[2438]: E1216 12:53:50.413052 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:50.514351 kubelet[2438]: E1216 12:53:50.514225 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:50.615343 kubelet[2438]: E1216 12:53:50.615004 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:50.716496 kubelet[2438]: E1216 12:53:50.716003 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:50.817212 kubelet[2438]: E1216 12:53:50.817127 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:50.917786 kubelet[2438]: E1216 12:53:50.917741 2438 
kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:51.018717 kubelet[2438]: E1216 12:53:51.018566 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:51.118897 kubelet[2438]: E1216 12:53:51.118851 2438 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:51.223249 kubelet[2438]: I1216 12:53:51.223214 2438 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:51.237211 kubelet[2438]: I1216 12:53:51.237132 2438 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:51.242740 kubelet[2438]: I1216 12:53:51.242656 2438 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:51.704923 kubelet[2438]: I1216 12:53:51.704750 2438 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:51.710549 kubelet[2438]: E1216 12:53:51.710509 2438 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb\" already exists" pod="kube-system/kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:51.722377 systemd[1]: Reload requested from client PID 2709 ('systemctl') (unit session-7.scope)... Dec 16 12:53:51.722416 systemd[1]: Reloading... Dec 16 12:53:51.813214 zram_generator::config[2756]: No configuration found. Dec 16 12:53:51.999076 systemd[1]: Reloading finished in 276 ms. Dec 16 12:53:52.024316 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:53:52.035187 systemd[1]: kubelet.service: Deactivated successfully. 
Dec 16 12:53:52.035812 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:53:52.037417 kernel: kauditd_printk_skb: 204 callbacks suppressed Dec 16 12:53:52.037468 kernel: audit: type=1131 audit(1765889632.034:395): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:52.034000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:52.037237 systemd[1]: kubelet.service: Consumed 619ms CPU time, 127.7M memory peak. Dec 16 12:53:52.046174 kernel: audit: type=1334 audit(1765889632.042:396): prog-id=113 op=LOAD Dec 16 12:53:52.042000 audit: BPF prog-id=113 op=LOAD Dec 16 12:53:52.043298 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Dec 16 12:53:52.042000 audit: BPF prog-id=69 op=UNLOAD Dec 16 12:53:52.051026 kernel: audit: type=1334 audit(1765889632.042:397): prog-id=69 op=UNLOAD Dec 16 12:53:52.051082 kernel: audit: type=1334 audit(1765889632.042:398): prog-id=114 op=LOAD Dec 16 12:53:52.042000 audit: BPF prog-id=114 op=LOAD Dec 16 12:53:52.053541 kernel: audit: type=1334 audit(1765889632.042:399): prog-id=115 op=LOAD Dec 16 12:53:52.042000 audit: BPF prog-id=115 op=LOAD Dec 16 12:53:52.042000 audit: BPF prog-id=73 op=UNLOAD Dec 16 12:53:52.042000 audit: BPF prog-id=74 op=UNLOAD Dec 16 12:53:52.044000 audit: BPF prog-id=116 op=LOAD Dec 16 12:53:52.044000 audit: BPF prog-id=66 op=UNLOAD Dec 16 12:53:52.044000 audit: BPF prog-id=117 op=LOAD Dec 16 12:53:52.044000 audit: BPF prog-id=118 op=LOAD Dec 16 12:53:52.044000 audit: BPF prog-id=67 op=UNLOAD Dec 16 12:53:52.044000 audit: BPF prog-id=68 op=UNLOAD Dec 16 12:53:52.045000 audit: BPF prog-id=119 op=LOAD Dec 16 12:53:52.045000 audit: BPF prog-id=75 op=UNLOAD Dec 16 12:53:52.045000 audit: BPF prog-id=120 op=LOAD Dec 16 12:53:52.045000 audit: BPF prog-id=121 op=LOAD Dec 16 12:53:52.045000 audit: BPF prog-id=76 op=UNLOAD Dec 16 12:53:52.045000 audit: BPF prog-id=77 op=UNLOAD Dec 16 12:53:52.046000 audit: BPF prog-id=122 op=LOAD Dec 16 12:53:52.046000 audit: BPF prog-id=81 op=UNLOAD Dec 16 12:53:52.048000 audit: BPF prog-id=123 op=LOAD Dec 16 12:53:52.048000 audit: BPF prog-id=63 op=UNLOAD Dec 16 12:53:52.048000 audit: BPF prog-id=124 op=LOAD Dec 16 12:53:52.054174 kernel: audit: type=1334 audit(1765889632.042:400): prog-id=73 op=UNLOAD Dec 16 12:53:52.054200 kernel: audit: type=1334 audit(1765889632.042:401): prog-id=74 op=UNLOAD Dec 16 12:53:52.054216 kernel: audit: type=1334 audit(1765889632.044:402): prog-id=116 op=LOAD Dec 16 12:53:52.054231 kernel: audit: type=1334 audit(1765889632.044:403): prog-id=66 op=UNLOAD Dec 16 12:53:52.054248 kernel: audit: type=1334 audit(1765889632.044:404): prog-id=117 op=LOAD Dec 16 12:53:52.048000 audit: BPF 
prog-id=125 op=LOAD Dec 16 12:53:52.048000 audit: BPF prog-id=64 op=UNLOAD Dec 16 12:53:52.048000 audit: BPF prog-id=65 op=UNLOAD Dec 16 12:53:52.050000 audit: BPF prog-id=126 op=LOAD Dec 16 12:53:52.051000 audit: BPF prog-id=78 op=UNLOAD Dec 16 12:53:52.051000 audit: BPF prog-id=127 op=LOAD Dec 16 12:53:52.051000 audit: BPF prog-id=128 op=LOAD Dec 16 12:53:52.051000 audit: BPF prog-id=79 op=UNLOAD Dec 16 12:53:52.051000 audit: BPF prog-id=80 op=UNLOAD Dec 16 12:53:52.051000 audit: BPF prog-id=129 op=LOAD Dec 16 12:53:52.051000 audit: BPF prog-id=82 op=UNLOAD Dec 16 12:53:52.052000 audit: BPF prog-id=130 op=LOAD Dec 16 12:53:52.052000 audit: BPF prog-id=70 op=UNLOAD Dec 16 12:53:52.052000 audit: BPF prog-id=131 op=LOAD Dec 16 12:53:52.052000 audit: BPF prog-id=132 op=LOAD Dec 16 12:53:52.052000 audit: BPF prog-id=71 op=UNLOAD Dec 16 12:53:52.052000 audit: BPF prog-id=72 op=UNLOAD Dec 16 12:53:52.185033 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:53:52.185000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:53:52.192450 (kubelet)[2807]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:53:52.261697 kubelet[2807]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:53:52.261697 kubelet[2807]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Dec 16 12:53:52.261697 kubelet[2807]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:53:52.261697 kubelet[2807]: I1216 12:53:52.260358 2807 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:53:52.266462 kubelet[2807]: I1216 12:53:52.266444 2807 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 16 12:53:52.266549 kubelet[2807]: I1216 12:53:52.266541 2807 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:53:52.266787 kubelet[2807]: I1216 12:53:52.266774 2807 server.go:954] "Client rotation is on, will bootstrap in background" Dec 16 12:53:52.269993 kubelet[2807]: I1216 12:53:52.269980 2807 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 16 12:53:52.279725 kubelet[2807]: I1216 12:53:52.279711 2807 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:53:52.288058 kubelet[2807]: I1216 12:53:52.288037 2807 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:53:52.291137 kubelet[2807]: I1216 12:53:52.291125 2807 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Dec 16 12:53:52.291392 kubelet[2807]: I1216 12:53:52.291371 2807 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:53:52.291566 kubelet[2807]: I1216 12:53:52.291439 2807 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4515-1-0-8-2e3d7ab7bb","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:53:52.291665 kubelet[2807]: I1216 12:53:52.291656 2807 topology_manager.go:138] "Creating topology manager 
with none policy" Dec 16 12:53:52.291709 kubelet[2807]: I1216 12:53:52.291703 2807 container_manager_linux.go:304] "Creating device plugin manager" Dec 16 12:53:52.291787 kubelet[2807]: I1216 12:53:52.291779 2807 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:53:52.291938 kubelet[2807]: I1216 12:53:52.291928 2807 kubelet.go:446] "Attempting to sync node with API server" Dec 16 12:53:52.292018 kubelet[2807]: I1216 12:53:52.291992 2807 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:53:52.292077 kubelet[2807]: I1216 12:53:52.292070 2807 kubelet.go:352] "Adding apiserver pod source" Dec 16 12:53:52.292121 kubelet[2807]: I1216 12:53:52.292115 2807 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:53:52.295279 kubelet[2807]: I1216 12:53:52.295249 2807 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:53:52.296037 kubelet[2807]: I1216 12:53:52.296020 2807 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 16 12:53:52.296438 kubelet[2807]: I1216 12:53:52.296419 2807 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 16 12:53:52.296585 kubelet[2807]: I1216 12:53:52.296479 2807 server.go:1287] "Started kubelet" Dec 16 12:53:52.299555 kubelet[2807]: I1216 12:53:52.299546 2807 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:53:52.304658 kubelet[2807]: I1216 12:53:52.304636 2807 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:53:52.305392 kubelet[2807]: I1216 12:53:52.305380 2807 server.go:479] "Adding debug handlers to kubelet server" Dec 16 12:53:52.306295 kubelet[2807]: I1216 12:53:52.306249 2807 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:53:52.306474 kubelet[2807]: I1216 12:53:52.306462 2807 server.go:243] "Starting to serve the 
podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:53:52.306655 kubelet[2807]: I1216 12:53:52.306644 2807 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:53:52.307954 kubelet[2807]: I1216 12:53:52.307943 2807 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 16 12:53:52.308172 kubelet[2807]: E1216 12:53:52.308146 2807 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-8-2e3d7ab7bb\" not found" Dec 16 12:53:52.309586 kubelet[2807]: I1216 12:53:52.309574 2807 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 16 12:53:52.309741 kubelet[2807]: I1216 12:53:52.309732 2807 reconciler.go:26] "Reconciler: start to sync state" Dec 16 12:53:52.311374 kubelet[2807]: I1216 12:53:52.311346 2807 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 16 12:53:52.312258 kubelet[2807]: I1216 12:53:52.312246 2807 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 16 12:53:52.312631 kubelet[2807]: I1216 12:53:52.312413 2807 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 16 12:53:52.312631 kubelet[2807]: I1216 12:53:52.312434 2807 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 16 12:53:52.312631 kubelet[2807]: I1216 12:53:52.312447 2807 kubelet.go:2382] "Starting kubelet main sync loop" Dec 16 12:53:52.312631 kubelet[2807]: E1216 12:53:52.312488 2807 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:53:52.317171 kubelet[2807]: I1216 12:53:52.317128 2807 factory.go:221] Registration of the systemd container factory successfully Dec 16 12:53:52.317299 kubelet[2807]: I1216 12:53:52.317225 2807 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:53:52.325180 kubelet[2807]: I1216 12:53:52.324280 2807 factory.go:221] Registration of the containerd container factory successfully Dec 16 12:53:52.355252 kubelet[2807]: I1216 12:53:52.355214 2807 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:53:52.355252 kubelet[2807]: I1216 12:53:52.355229 2807 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:53:52.355252 kubelet[2807]: I1216 12:53:52.355245 2807 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:53:52.355416 kubelet[2807]: I1216 12:53:52.355393 2807 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 16 12:53:52.355437 kubelet[2807]: I1216 12:53:52.355406 2807 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 16 12:53:52.355437 kubelet[2807]: I1216 12:53:52.355427 2807 policy_none.go:49] "None policy: Start" Dec 16 12:53:52.355437 kubelet[2807]: I1216 12:53:52.355436 2807 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 16 12:53:52.355489 kubelet[2807]: I1216 12:53:52.355445 2807 state_mem.go:35] "Initializing new in-memory state store" Dec 16 12:53:52.355546 kubelet[2807]: I1216 12:53:52.355525 2807 state_mem.go:75] "Updated machine memory state" Dec 16 12:53:52.359686 kubelet[2807]: 
I1216 12:53:52.358686 2807 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 16 12:53:52.359686 kubelet[2807]: I1216 12:53:52.358798 2807 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:53:52.359686 kubelet[2807]: I1216 12:53:52.358806 2807 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:53:52.359686 kubelet[2807]: I1216 12:53:52.358955 2807 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:53:52.360635 kubelet[2807]: E1216 12:53:52.360510 2807 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 12:53:52.414446 kubelet[2807]: I1216 12:53:52.414403 2807 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.416183 kubelet[2807]: I1216 12:53:52.416089 2807 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.416961 kubelet[2807]: I1216 12:53:52.416886 2807 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.425954 kubelet[2807]: E1216 12:53:52.425925 2807 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb\" already exists" pod="kube-system/kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.426299 kubelet[2807]: E1216 12:53:52.426282 2807 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb\" already exists" pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.426486 kubelet[2807]: E1216 12:53:52.426445 2807 kubelet.go:3196] "Failed creating a mirror pod" err="pods 
\"kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb\" already exists" pod="kube-system/kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.461996 kubelet[2807]: I1216 12:53:52.461927 2807 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.469613 kubelet[2807]: I1216 12:53:52.469582 2807 kubelet_node_status.go:124] "Node was previously registered" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.469845 kubelet[2807]: I1216 12:53:52.469824 2807 kubelet_node_status.go:78] "Successfully registered node" node="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.512055 kubelet[2807]: I1216 12:53:52.511684 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a45033eea666a2fd3bce119ad176bc71-kubeconfig\") pod \"kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"a45033eea666a2fd3bce119ad176bc71\") " pod="kube-system/kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.512055 kubelet[2807]: I1216 12:53:52.511741 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/12e2af47f996b27ec9bf257ff79845f4-ca-certs\") pod \"kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"12e2af47f996b27ec9bf257ff79845f4\") " pod="kube-system/kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.512055 kubelet[2807]: I1216 12:53:52.511774 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/12e2af47f996b27ec9bf257ff79845f4-k8s-certs\") pod \"kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"12e2af47f996b27ec9bf257ff79845f4\") " pod="kube-system/kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.512055 kubelet[2807]: I1216 12:53:52.511814 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1f2e41c58f9938b008f633e63e0502f3-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"1f2e41c58f9938b008f633e63e0502f3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.512055 kubelet[2807]: I1216 12:53:52.511859 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/12e2af47f996b27ec9bf257ff79845f4-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"12e2af47f996b27ec9bf257ff79845f4\") " pod="kube-system/kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.512378 kubelet[2807]: I1216 12:53:52.511888 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1f2e41c58f9938b008f633e63e0502f3-ca-certs\") pod \"kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"1f2e41c58f9938b008f633e63e0502f3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.512378 kubelet[2807]: I1216 12:53:52.511918 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1f2e41c58f9938b008f633e63e0502f3-flexvolume-dir\") pod \"kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"1f2e41c58f9938b008f633e63e0502f3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.512378 kubelet[2807]: I1216 12:53:52.511946 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1f2e41c58f9938b008f633e63e0502f3-k8s-certs\") pod \"kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"1f2e41c58f9938b008f633e63e0502f3\") " 
pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:52.512378 kubelet[2807]: I1216 12:53:52.511970 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1f2e41c58f9938b008f633e63e0502f3-kubeconfig\") pod \"kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb\" (UID: \"1f2e41c58f9938b008f633e63e0502f3\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:53.299927 kubelet[2807]: I1216 12:53:53.299806 2807 apiserver.go:52] "Watching apiserver" Dec 16 12:53:53.309779 kubelet[2807]: I1216 12:53:53.309739 2807 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 16 12:53:53.345533 kubelet[2807]: I1216 12:53:53.345454 2807 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:53.353274 kubelet[2807]: E1216 12:53:53.353251 2807 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb\" already exists" pod="kube-system/kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:53:53.366233 kubelet[2807]: I1216 12:53:53.366184 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" podStartSLOduration=2.366144624 podStartE2EDuration="2.366144624s" podCreationTimestamp="2025-12-16 12:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:53:53.365776871 +0000 UTC m=+1.151430875" watchObservedRunningTime="2025-12-16 12:53:53.366144624 +0000 UTC m=+1.151798627" Dec 16 12:53:53.384510 kubelet[2807]: I1216 12:53:53.383380 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4515-1-0-8-2e3d7ab7bb" podStartSLOduration=2.383231386 
podStartE2EDuration="2.383231386s" podCreationTimestamp="2025-12-16 12:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:53:53.376042385 +0000 UTC m=+1.161696388" watchObservedRunningTime="2025-12-16 12:53:53.383231386 +0000 UTC m=+1.168885389" Dec 16 12:53:53.384510 kubelet[2807]: I1216 12:53:53.383754 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4515-1-0-8-2e3d7ab7bb" podStartSLOduration=2.38374464 podStartE2EDuration="2.38374464s" podCreationTimestamp="2025-12-16 12:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:53:53.38362087 +0000 UTC m=+1.169274873" watchObservedRunningTime="2025-12-16 12:53:53.38374464 +0000 UTC m=+1.169398653" Dec 16 12:53:56.201780 kubelet[2807]: I1216 12:53:56.201723 2807 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 12:53:56.203055 kubelet[2807]: I1216 12:53:56.202945 2807 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 12:53:56.203114 containerd[1649]: time="2025-12-16T12:53:56.202605385Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 16 12:53:57.196041 systemd[1]: Created slice kubepods-besteffort-pod450c542a_d8c5_4112_9343_c1991007b682.slice - libcontainer container kubepods-besteffort-pod450c542a_d8c5_4112_9343_c1991007b682.slice. 
Dec 16 12:53:57.242868 kubelet[2807]: I1216 12:53:57.242769 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7d97\" (UniqueName: \"kubernetes.io/projected/450c542a-d8c5-4112-9343-c1991007b682-kube-api-access-n7d97\") pod \"kube-proxy-6989l\" (UID: \"450c542a-d8c5-4112-9343-c1991007b682\") " pod="kube-system/kube-proxy-6989l" Dec 16 12:53:57.242868 kubelet[2807]: I1216 12:53:57.242832 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/450c542a-d8c5-4112-9343-c1991007b682-kube-proxy\") pod \"kube-proxy-6989l\" (UID: \"450c542a-d8c5-4112-9343-c1991007b682\") " pod="kube-system/kube-proxy-6989l" Dec 16 12:53:57.242868 kubelet[2807]: I1216 12:53:57.242866 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/450c542a-d8c5-4112-9343-c1991007b682-xtables-lock\") pod \"kube-proxy-6989l\" (UID: \"450c542a-d8c5-4112-9343-c1991007b682\") " pod="kube-system/kube-proxy-6989l" Dec 16 12:53:57.243596 kubelet[2807]: I1216 12:53:57.242888 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/450c542a-d8c5-4112-9343-c1991007b682-lib-modules\") pod \"kube-proxy-6989l\" (UID: \"450c542a-d8c5-4112-9343-c1991007b682\") " pod="kube-system/kube-proxy-6989l" Dec 16 12:53:57.339604 systemd[1]: Created slice kubepods-besteffort-pod06a450cd_e4ff_47f9_8c2c_ce6bc5fdbb05.slice - libcontainer container kubepods-besteffort-pod06a450cd_e4ff_47f9_8c2c_ce6bc5fdbb05.slice. 
Dec 16 12:53:57.343094 kubelet[2807]: I1216 12:53:57.343048 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/06a450cd-e4ff-47f9-8c2c-ce6bc5fdbb05-var-lib-calico\") pod \"tigera-operator-7dcd859c48-9ldhb\" (UID: \"06a450cd-e4ff-47f9-8c2c-ce6bc5fdbb05\") " pod="tigera-operator/tigera-operator-7dcd859c48-9ldhb" Dec 16 12:53:57.344066 kubelet[2807]: I1216 12:53:57.343124 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6t6d\" (UniqueName: \"kubernetes.io/projected/06a450cd-e4ff-47f9-8c2c-ce6bc5fdbb05-kube-api-access-w6t6d\") pod \"tigera-operator-7dcd859c48-9ldhb\" (UID: \"06a450cd-e4ff-47f9-8c2c-ce6bc5fdbb05\") " pod="tigera-operator/tigera-operator-7dcd859c48-9ldhb" Dec 16 12:53:57.503863 containerd[1649]: time="2025-12-16T12:53:57.503734595Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6989l,Uid:450c542a-d8c5-4112-9343-c1991007b682,Namespace:kube-system,Attempt:0,}" Dec 16 12:53:57.530492 containerd[1649]: time="2025-12-16T12:53:57.529824918Z" level=info msg="connecting to shim 7a0e7f2c79aa1cd760eb34ab926fbc00c7b9c5d6513822cbc8b97a82c8f6466d" address="unix:///run/containerd/s/6486be5255e9daf9b5108c4ca445ad0a8643c6c5c4bdfd3fcd91abb4e3ade96b" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:53:57.567649 systemd[1]: Started cri-containerd-7a0e7f2c79aa1cd760eb34ab926fbc00c7b9c5d6513822cbc8b97a82c8f6466d.scope - libcontainer container 7a0e7f2c79aa1cd760eb34ab926fbc00c7b9c5d6513822cbc8b97a82c8f6466d. 
Dec 16 12:53:57.601668 kernel: kauditd_printk_skb: 32 callbacks suppressed Dec 16 12:53:57.601858 kernel: audit: type=1334 audit(1765889637.595:437): prog-id=133 op=LOAD Dec 16 12:53:57.595000 audit: BPF prog-id=133 op=LOAD Dec 16 12:53:57.601000 audit: BPF prog-id=134 op=LOAD Dec 16 12:53:57.601000 audit[2873]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2862 pid=2873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.608306 kernel: audit: type=1334 audit(1765889637.601:438): prog-id=134 op=LOAD Dec 16 12:53:57.608376 kernel: audit: type=1300 audit(1765889637.601:438): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2862 pid=2873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.619273 kernel: audit: type=1327 audit(1765889637.601:438): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761306537663263373961613163643736306562333461623932366662 Dec 16 12:53:57.601000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761306537663263373961613163643736306562333461623932366662 Dec 16 12:53:57.602000 audit: BPF prog-id=134 op=UNLOAD Dec 16 12:53:57.645264 kernel: audit: type=1334 audit(1765889637.602:439): prog-id=134 op=UNLOAD Dec 16 12:53:57.645349 kernel: audit: type=1300 audit(1765889637.602:439): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 
ppid=2862 pid=2873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.602000 audit[2873]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2862 pid=2873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.647050 containerd[1649]: time="2025-12-16T12:53:57.645602747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-9ldhb,Uid:06a450cd-e4ff-47f9-8c2c-ce6bc5fdbb05,Namespace:tigera-operator,Attempt:0,}" Dec 16 12:53:57.602000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761306537663263373961613163643736306562333461623932366662 Dec 16 12:53:57.603000 audit: BPF prog-id=135 op=LOAD Dec 16 12:53:57.663716 kernel: audit: type=1327 audit(1765889637.602:439): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761306537663263373961613163643736306562333461623932366662 Dec 16 12:53:57.663775 kernel: audit: type=1334 audit(1765889637.603:440): prog-id=135 op=LOAD Dec 16 12:53:57.666238 kernel: audit: type=1300 audit(1765889637.603:440): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2862 pid=2873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.603000 audit[2873]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 
a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2862 pid=2873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.672102 containerd[1649]: time="2025-12-16T12:53:57.671927099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6989l,Uid:450c542a-d8c5-4112-9343-c1991007b682,Namespace:kube-system,Attempt:0,} returns sandbox id \"7a0e7f2c79aa1cd760eb34ab926fbc00c7b9c5d6513822cbc8b97a82c8f6466d\"" Dec 16 12:53:57.678589 containerd[1649]: time="2025-12-16T12:53:57.678419745Z" level=info msg="CreateContainer within sandbox \"7a0e7f2c79aa1cd760eb34ab926fbc00c7b9c5d6513822cbc8b97a82c8f6466d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 12:53:57.603000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761306537663263373961613163643736306562333461623932366662 Dec 16 12:53:57.687451 containerd[1649]: time="2025-12-16T12:53:57.687296558Z" level=info msg="connecting to shim da73234bd5dfc4b037af8db80fb8bd2d06a9323a8605c68e6cb24b4165e40df0" address="unix:///run/containerd/s/76b06cbbd907652c54a017a70d7ac1079bf32f5b9bef7046055052dda33cdc16" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:53:57.691216 kernel: audit: type=1327 audit(1765889637.603:440): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761306537663263373961613163643736306562333461623932366662 Dec 16 12:53:57.603000 audit: BPF prog-id=136 op=LOAD Dec 16 12:53:57.603000 audit[2873]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2862 pid=2873 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.603000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761306537663263373961613163643736306562333461623932366662 Dec 16 12:53:57.603000 audit: BPF prog-id=136 op=UNLOAD Dec 16 12:53:57.603000 audit[2873]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2862 pid=2873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.603000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761306537663263373961613163643736306562333461623932366662 Dec 16 12:53:57.603000 audit: BPF prog-id=135 op=UNLOAD Dec 16 12:53:57.603000 audit[2873]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2862 pid=2873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.603000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761306537663263373961613163643736306562333461623932366662 Dec 16 12:53:57.603000 audit: BPF prog-id=137 op=LOAD Dec 16 12:53:57.603000 audit[2873]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 
ppid=2862 pid=2873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.603000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761306537663263373961613163643736306562333461623932366662 Dec 16 12:53:57.694876 containerd[1649]: time="2025-12-16T12:53:57.694730155Z" level=info msg="Container 572c106037c8a4787831c3f5a665682b126cf3fdcf0d01c1b4ec6aba645b3612: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:53:57.703412 containerd[1649]: time="2025-12-16T12:53:57.703387606Z" level=info msg="CreateContainer within sandbox \"7a0e7f2c79aa1cd760eb34ab926fbc00c7b9c5d6513822cbc8b97a82c8f6466d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"572c106037c8a4787831c3f5a665682b126cf3fdcf0d01c1b4ec6aba645b3612\"" Dec 16 12:53:57.706177 containerd[1649]: time="2025-12-16T12:53:57.706147418Z" level=info msg="StartContainer for \"572c106037c8a4787831c3f5a665682b126cf3fdcf0d01c1b4ec6aba645b3612\"" Dec 16 12:53:57.708728 containerd[1649]: time="2025-12-16T12:53:57.708696904Z" level=info msg="connecting to shim 572c106037c8a4787831c3f5a665682b126cf3fdcf0d01c1b4ec6aba645b3612" address="unix:///run/containerd/s/6486be5255e9daf9b5108c4ca445ad0a8643c6c5c4bdfd3fcd91abb4e3ade96b" protocol=ttrpc version=3 Dec 16 12:53:57.715298 systemd[1]: Started cri-containerd-da73234bd5dfc4b037af8db80fb8bd2d06a9323a8605c68e6cb24b4165e40df0.scope - libcontainer container da73234bd5dfc4b037af8db80fb8bd2d06a9323a8605c68e6cb24b4165e40df0. Dec 16 12:53:57.724202 systemd[1]: Started cri-containerd-572c106037c8a4787831c3f5a665682b126cf3fdcf0d01c1b4ec6aba645b3612.scope - libcontainer container 572c106037c8a4787831c3f5a665682b126cf3fdcf0d01c1b4ec6aba645b3612. 
Dec 16 12:53:57.725000 audit: BPF prog-id=138 op=LOAD Dec 16 12:53:57.726000 audit: BPF prog-id=139 op=LOAD Dec 16 12:53:57.726000 audit[2918]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00019c238 a2=98 a3=0 items=0 ppid=2907 pid=2918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.726000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461373332333462643564666334623033376166386462383066623862 Dec 16 12:53:57.726000 audit: BPF prog-id=139 op=UNLOAD Dec 16 12:53:57.726000 audit[2918]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2907 pid=2918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.726000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461373332333462643564666334623033376166386462383066623862 Dec 16 12:53:57.726000 audit: BPF prog-id=140 op=LOAD Dec 16 12:53:57.726000 audit[2918]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00019c488 a2=98 a3=0 items=0 ppid=2907 pid=2918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.726000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461373332333462643564666334623033376166386462383066623862 Dec 16 12:53:57.726000 audit: BPF prog-id=141 op=LOAD Dec 16 12:53:57.726000 audit[2918]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00019c218 a2=98 a3=0 items=0 ppid=2907 pid=2918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.726000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461373332333462643564666334623033376166386462383066623862 Dec 16 12:53:57.726000 audit: BPF prog-id=141 op=UNLOAD Dec 16 12:53:57.726000 audit[2918]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2907 pid=2918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.726000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461373332333462643564666334623033376166386462383066623862 Dec 16 12:53:57.726000 audit: BPF prog-id=140 op=UNLOAD Dec 16 12:53:57.726000 audit[2918]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2907 pid=2918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:53:57.726000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461373332333462643564666334623033376166386462383066623862 Dec 16 12:53:57.726000 audit: BPF prog-id=142 op=LOAD Dec 16 12:53:57.726000 audit[2918]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00019c6e8 a2=98 a3=0 items=0 ppid=2907 pid=2918 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.726000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6461373332333462643564666334623033376166386462383066623862 Dec 16 12:53:57.759918 containerd[1649]: time="2025-12-16T12:53:57.759801187Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-9ldhb,Uid:06a450cd-e4ff-47f9-8c2c-ce6bc5fdbb05,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"da73234bd5dfc4b037af8db80fb8bd2d06a9323a8605c68e6cb24b4165e40df0\"" Dec 16 12:53:57.761940 containerd[1649]: time="2025-12-16T12:53:57.761818860Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 12:53:57.764000 audit: BPF prog-id=143 op=LOAD Dec 16 12:53:57.764000 audit[2929]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2862 pid=2929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.764000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537326331303630333763386134373837383331633366356136363536 Dec 16 12:53:57.764000 audit: BPF prog-id=144 op=LOAD Dec 16 12:53:57.764000 audit[2929]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=2862 pid=2929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.764000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537326331303630333763386134373837383331633366356136363536 Dec 16 12:53:57.764000 audit: BPF prog-id=144 op=UNLOAD Dec 16 12:53:57.764000 audit[2929]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2862 pid=2929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.764000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537326331303630333763386134373837383331633366356136363536 Dec 16 12:53:57.764000 audit: BPF prog-id=143 op=UNLOAD Dec 16 12:53:57.764000 audit[2929]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2862 pid=2929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:53:57.764000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537326331303630333763386134373837383331633366356136363536 Dec 16 12:53:57.764000 audit: BPF prog-id=145 op=LOAD Dec 16 12:53:57.764000 audit[2929]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=2862 pid=2929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:57.764000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537326331303630333763386134373837383331633366356136363536 Dec 16 12:53:57.781654 containerd[1649]: time="2025-12-16T12:53:57.781619697Z" level=info msg="StartContainer for \"572c106037c8a4787831c3f5a665682b126cf3fdcf0d01c1b4ec6aba645b3612\" returns successfully" Dec 16 12:53:58.072000 audit[3008]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3008 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.072000 audit[3008]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffd6c26050 a2=0 a3=7fffd6c2603c items=0 ppid=2949 pid=3008 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.072000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:53:58.073000 audit[3010]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3010 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.073000 audit[3010]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff3a85c760 a2=0 a3=7fff3a85c74c items=0 ppid=2949 pid=3010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.073000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:53:58.074000 audit[3011]: NETFILTER_CFG table=nat:56 family=10 entries=1 op=nft_register_chain pid=3011 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.074000 audit[3012]: NETFILTER_CFG table=nat:57 family=2 entries=1 op=nft_register_chain pid=3012 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.074000 audit[3011]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdac3b6b10 a2=0 a3=7ffdac3b6afc items=0 ppid=2949 pid=3011 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.074000 audit[3012]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcc07d2420 a2=0 a3=7ffcc07d240c items=0 ppid=2949 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.074000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:53:58.074000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:53:58.076000 audit[3013]: NETFILTER_CFG table=filter:58 family=2 entries=1 
op=nft_register_chain pid=3013 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.076000 audit[3013]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff799f1b60 a2=0 a3=7fff799f1b4c items=0 ppid=2949 pid=3013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.076000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:53:58.078000 audit[3014]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3014 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.078000 audit[3014]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc9008d3f0 a2=0 a3=7ffc9008d3dc items=0 ppid=2949 pid=3014 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.078000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:53:58.183000 audit[3015]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3015 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.183000 audit[3015]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffdaadd0410 a2=0 a3=7ffdaadd03fc items=0 ppid=2949 pid=3015 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.183000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:53:58.188000 
audit[3017]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3017 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.188000 audit[3017]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fff1c27fe10 a2=0 a3=7fff1c27fdfc items=0 ppid=2949 pid=3017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.188000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Dec 16 12:53:58.192000 audit[3020]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3020 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.192000 audit[3020]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffdcb62e9f0 a2=0 a3=7ffdcb62e9dc items=0 ppid=2949 pid=3020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.192000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Dec 16 12:53:58.193000 audit[3021]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3021 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.193000 audit[3021]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff84d191e0 a2=0 a3=7fff84d191cc items=0 ppid=2949 pid=3021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.193000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:53:58.196000 audit[3023]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3023 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.196000 audit[3023]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffdca5585e0 a2=0 a3=7ffdca5585cc items=0 ppid=2949 pid=3023 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.196000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:53:58.197000 audit[3024]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3024 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.197000 audit[3024]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdf0fae7a0 a2=0 a3=7ffdf0fae78c items=0 ppid=2949 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.197000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:53:58.200000 audit[3026]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3026 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.200000 audit[3026]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd2dbd20c0 a2=0 a3=7ffd2dbd20ac items=0 ppid=2949 pid=3026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.200000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Dec 16 12:53:58.206000 audit[3029]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3029 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.206000 audit[3029]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd45dfc190 a2=0 a3=7ffd45dfc17c items=0 ppid=2949 pid=3029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.206000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Dec 16 12:53:58.208000 audit[3030]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3030 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.208000 audit[3030]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe0e44ebe0 a2=0 a3=7ffe0e44ebcc items=0 ppid=2949 pid=3030 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.208000 audit: 
PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:53:58.211000 audit[3032]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3032 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.211000 audit[3032]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe23798260 a2=0 a3=7ffe2379824c items=0 ppid=2949 pid=3032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.211000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:53:58.213000 audit[3033]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3033 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.213000 audit[3033]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcb9c85780 a2=0 a3=7ffcb9c8576c items=0 ppid=2949 pid=3033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.213000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:53:58.217000 audit[3035]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3035 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.217000 audit[3035]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffdfdd972d0 a2=0 a3=7ffdfdd972bc items=0 ppid=2949 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.217000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Dec 16 12:53:58.221000 audit[3038]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3038 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.221000 audit[3038]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff799958f0 a2=0 a3=7fff799958dc items=0 ppid=2949 pid=3038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.221000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Dec 16 12:53:58.227000 audit[3041]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3041 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.227000 audit[3041]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe86c864b0 a2=0 a3=7ffe86c8649c items=0 ppid=2949 pid=3041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.227000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Dec 16 12:53:58.229000 audit[3042]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3042 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.229000 audit[3042]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffdb9da3570 a2=0 a3=7ffdb9da355c items=0 ppid=2949 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.229000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:53:58.234000 audit[3044]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3044 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.234000 audit[3044]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffd66ef7bf0 a2=0 a3=7ffd66ef7bdc items=0 ppid=2949 pid=3044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.234000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:53:58.238000 audit[3047]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3047 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.238000 audit[3047]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcd0564dd0 a2=0 a3=7ffcd0564dbc 
items=0 ppid=2949 pid=3047 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.238000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:53:58.239000 audit[3048]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3048 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.239000 audit[3048]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffc1978810 a2=0 a3=7fffc19787fc items=0 ppid=2949 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.239000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:53:58.243000 audit[3050]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3050 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:53:58.243000 audit[3050]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7fff362cd610 a2=0 a3=7fff362cd5fc items=0 ppid=2949 pid=3050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.243000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:53:58.268000 audit[3056]: 
NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3056 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:53:58.268000 audit[3056]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd08395630 a2=0 a3=7ffd0839561c items=0 ppid=2949 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.268000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:53:58.279000 audit[3056]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3056 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:53:58.279000 audit[3056]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffd08395630 a2=0 a3=7ffd0839561c items=0 ppid=2949 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.279000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:53:58.280000 audit[3061]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3061 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.280000 audit[3061]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7fffac08b000 a2=0 a3=7fffac08afec items=0 ppid=2949 pid=3061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.280000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:53:58.284000 audit[3063]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3063 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.284000 audit[3063]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7fffaf6d5ff0 a2=0 a3=7fffaf6d5fdc items=0 ppid=2949 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.284000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Dec 16 12:53:58.290000 audit[3066]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3066 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.290000 audit[3066]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fffab9e4680 a2=0 a3=7fffab9e466c items=0 ppid=2949 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.290000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Dec 16 12:53:58.291000 audit[3067]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3067 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.291000 audit[3067]: 
SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffde18a5420 a2=0 a3=7ffde18a540c items=0 ppid=2949 pid=3067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.291000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:53:58.296000 audit[3069]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3069 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.296000 audit[3069]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe5fb9a5e0 a2=0 a3=7ffe5fb9a5cc items=0 ppid=2949 pid=3069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.296000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:53:58.298000 audit[3070]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3070 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.298000 audit[3070]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcd3603c70 a2=0 a3=7ffcd3603c5c items=0 ppid=2949 pid=3070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.298000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:53:58.303000 audit[3072]: 
NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3072 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.303000 audit[3072]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffe43fe1590 a2=0 a3=7ffe43fe157c items=0 ppid=2949 pid=3072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.303000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Dec 16 12:53:58.308000 audit[3075]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3075 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.308000 audit[3075]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffe05724090 a2=0 a3=7ffe0572407c items=0 ppid=2949 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.308000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Dec 16 12:53:58.310000 audit[3076]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3076 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.310000 audit[3076]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffef6a6e160 a2=0 a3=7ffef6a6e14c items=0 ppid=2949 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.310000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:53:58.315000 audit[3078]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3078 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.315000 audit[3078]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc2a42e8d0 a2=0 a3=7ffc2a42e8bc items=0 ppid=2949 pid=3078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.315000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:53:58.317000 audit[3079]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3079 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.317000 audit[3079]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffef6d336d0 a2=0 a3=7ffef6d336bc items=0 ppid=2949 pid=3079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.317000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:53:58.321000 audit[3081]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3081 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.321000 audit[3081]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffdf710efa0 a2=0 a3=7ffdf710ef8c items=0 ppid=2949 pid=3081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.321000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Dec 16 12:53:58.327000 audit[3084]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3084 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.327000 audit[3084]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffff9d29f20 a2=0 a3=7ffff9d29f0c items=0 ppid=2949 pid=3084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.327000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Dec 16 12:53:58.332000 audit[3087]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3087 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.332000 audit[3087]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff2f942330 a2=0 a3=7fff2f94231c items=0 ppid=2949 pid=3087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.332000 audit: 
PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Dec 16 12:53:58.333000 audit[3088]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3088 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.333000 audit[3088]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe67459910 a2=0 a3=7ffe674598fc items=0 ppid=2949 pid=3088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.333000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:53:58.336000 audit[3090]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3090 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.336000 audit[3090]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffd633c40c0 a2=0 a3=7ffd633c40ac items=0 ppid=2949 pid=3090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.336000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:53:58.339000 audit[3093]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3093 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.339000 audit[3093]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc0c230550 
a2=0 a3=7ffc0c23053c items=0 ppid=2949 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.339000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:53:58.341000 audit[3094]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.341000 audit[3094]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe889e4b80 a2=0 a3=7ffe889e4b6c items=0 ppid=2949 pid=3094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.341000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:53:58.344000 audit[3096]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3096 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.344000 audit[3096]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffdd4ac05e0 a2=0 a3=7ffdd4ac05cc items=0 ppid=2949 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.344000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 
12:53:58.346000 audit[3097]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.346000 audit[3097]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff7b496e20 a2=0 a3=7fff7b496e0c items=0 ppid=2949 pid=3097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.346000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:53:58.348000 audit[3099]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3099 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.348000 audit[3099]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffdeadbde70 a2=0 a3=7ffdeadbde5c items=0 ppid=2949 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.348000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:53:58.352000 audit[3102]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:53:58.352000 audit[3102]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffecc8cb5e0 a2=0 a3=7ffecc8cb5cc items=0 ppid=2949 pid=3102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.352000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:53:58.355000 audit[3104]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3104 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:53:58.355000 audit[3104]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7ffc3428eec0 a2=0 a3=7ffc3428eeac items=0 ppid=2949 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.355000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:53:58.357000 audit[3104]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3104 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:53:58.357000 audit[3104]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffc3428eec0 a2=0 a3=7ffc3428eeac items=0 ppid=2949 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:53:58.357000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:53:59.567438 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1756135712.mount: Deactivated successfully. 
Dec 16 12:54:01.018236 containerd[1649]: time="2025-12-16T12:54:01.018181398Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:01.019407 containerd[1649]: time="2025-12-16T12:54:01.019297113Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Dec 16 12:54:01.020244 containerd[1649]: time="2025-12-16T12:54:01.020216843Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:01.022473 containerd[1649]: time="2025-12-16T12:54:01.022439675Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:01.023116 containerd[1649]: time="2025-12-16T12:54:01.023092826Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 3.261247725s" Dec 16 12:54:01.023246 containerd[1649]: time="2025-12-16T12:54:01.023227974Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Dec 16 12:54:01.027932 containerd[1649]: time="2025-12-16T12:54:01.027872531Z" level=info msg="CreateContainer within sandbox \"da73234bd5dfc4b037af8db80fb8bd2d06a9323a8605c68e6cb24b4165e40df0\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 12:54:01.034750 containerd[1649]: time="2025-12-16T12:54:01.034329946Z" level=info msg="Container 
dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:54:01.047811 containerd[1649]: time="2025-12-16T12:54:01.047772037Z" level=info msg="CreateContainer within sandbox \"da73234bd5dfc4b037af8db80fb8bd2d06a9323a8605c68e6cb24b4165e40df0\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615\"" Dec 16 12:54:01.048975 containerd[1649]: time="2025-12-16T12:54:01.048433442Z" level=info msg="StartContainer for \"dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615\"" Dec 16 12:54:01.049852 containerd[1649]: time="2025-12-16T12:54:01.049826588Z" level=info msg="connecting to shim dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615" address="unix:///run/containerd/s/76b06cbbd907652c54a017a70d7ac1079bf32f5b9bef7046055052dda33cdc16" protocol=ttrpc version=3 Dec 16 12:54:01.074315 systemd[1]: Started cri-containerd-dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615.scope - libcontainer container dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615. 
Dec 16 12:54:01.084000 audit: BPF prog-id=146 op=LOAD Dec 16 12:54:01.084000 audit: BPF prog-id=147 op=LOAD Dec 16 12:54:01.084000 audit[3113]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2907 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:01.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463623062376336393031383330356665636537303037346565333264 Dec 16 12:54:01.084000 audit: BPF prog-id=147 op=UNLOAD Dec 16 12:54:01.084000 audit[3113]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2907 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:01.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463623062376336393031383330356665636537303037346565333264 Dec 16 12:54:01.084000 audit: BPF prog-id=148 op=LOAD Dec 16 12:54:01.084000 audit[3113]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2907 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:01.084000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463623062376336393031383330356665636537303037346565333264 Dec 16 12:54:01.084000 audit: BPF prog-id=149 op=LOAD Dec 16 12:54:01.084000 audit[3113]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2907 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:01.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463623062376336393031383330356665636537303037346565333264 Dec 16 12:54:01.084000 audit: BPF prog-id=149 op=UNLOAD Dec 16 12:54:01.084000 audit[3113]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2907 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:01.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463623062376336393031383330356665636537303037346565333264 Dec 16 12:54:01.084000 audit: BPF prog-id=148 op=UNLOAD Dec 16 12:54:01.084000 audit[3113]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2907 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:54:01.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463623062376336393031383330356665636537303037346565333264 Dec 16 12:54:01.084000 audit: BPF prog-id=150 op=LOAD Dec 16 12:54:01.084000 audit[3113]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2907 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:01.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463623062376336393031383330356665636537303037346565333264 Dec 16 12:54:01.098423 containerd[1649]: time="2025-12-16T12:54:01.098359770Z" level=info msg="StartContainer for \"dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615\" returns successfully" Dec 16 12:54:01.382858 kubelet[2807]: I1216 12:54:01.381891 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-6989l" podStartSLOduration=4.381873281 podStartE2EDuration="4.381873281s" podCreationTimestamp="2025-12-16 12:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:53:58.38103766 +0000 UTC m=+6.166691723" watchObservedRunningTime="2025-12-16 12:54:01.381873281 +0000 UTC m=+9.167527294" Dec 16 12:54:01.382858 kubelet[2807]: I1216 12:54:01.381983 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-9ldhb" podStartSLOduration=1.119202648 podStartE2EDuration="4.38197678s" 
podCreationTimestamp="2025-12-16 12:53:57 +0000 UTC" firstStartedPulling="2025-12-16 12:53:57.761291495 +0000 UTC m=+5.546945498" lastFinishedPulling="2025-12-16 12:54:01.024065627 +0000 UTC m=+8.809719630" observedRunningTime="2025-12-16 12:54:01.38194054 +0000 UTC m=+9.167594593" watchObservedRunningTime="2025-12-16 12:54:01.38197678 +0000 UTC m=+9.167630782" Dec 16 12:54:03.683782 systemd[1]: cri-containerd-dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615.scope: Deactivated successfully. Dec 16 12:54:03.691468 kernel: kauditd_printk_skb: 224 callbacks suppressed Dec 16 12:54:03.691542 kernel: audit: type=1334 audit(1765889643.688:517): prog-id=146 op=UNLOAD Dec 16 12:54:03.688000 audit: BPF prog-id=146 op=UNLOAD Dec 16 12:54:03.688000 audit: BPF prog-id=150 op=UNLOAD Dec 16 12:54:03.696224 kernel: audit: type=1334 audit(1765889643.688:518): prog-id=150 op=UNLOAD Dec 16 12:54:03.718734 containerd[1649]: time="2025-12-16T12:54:03.718682555Z" level=info msg="received container exit event container_id:\"dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615\" id:\"dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615\" pid:3129 exit_status:1 exited_at:{seconds:1765889643 nanos:687635353}" Dec 16 12:54:03.755008 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615-rootfs.mount: Deactivated successfully. 
Dec 16 12:54:04.387185 kubelet[2807]: I1216 12:54:04.386347 2807 scope.go:117] "RemoveContainer" containerID="dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615" Dec 16 12:54:04.391487 containerd[1649]: time="2025-12-16T12:54:04.389031309Z" level=info msg="CreateContainer within sandbox \"da73234bd5dfc4b037af8db80fb8bd2d06a9323a8605c68e6cb24b4165e40df0\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 16 12:54:04.408275 containerd[1649]: time="2025-12-16T12:54:04.407691990Z" level=info msg="Container f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:54:04.417337 containerd[1649]: time="2025-12-16T12:54:04.417294229Z" level=info msg="CreateContainer within sandbox \"da73234bd5dfc4b037af8db80fb8bd2d06a9323a8605c68e6cb24b4165e40df0\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157\"" Dec 16 12:54:04.419947 containerd[1649]: time="2025-12-16T12:54:04.419919133Z" level=info msg="StartContainer for \"f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157\"" Dec 16 12:54:04.423510 containerd[1649]: time="2025-12-16T12:54:04.422724172Z" level=info msg="connecting to shim f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157" address="unix:///run/containerd/s/76b06cbbd907652c54a017a70d7ac1079bf32f5b9bef7046055052dda33cdc16" protocol=ttrpc version=3 Dec 16 12:54:04.446350 systemd[1]: Started cri-containerd-f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157.scope - libcontainer container f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157. 
Dec 16 12:54:04.463287 kernel: audit: type=1334 audit(1765889644.460:519): prog-id=151 op=LOAD Dec 16 12:54:04.460000 audit: BPF prog-id=151 op=LOAD Dec 16 12:54:04.463000 audit: BPF prog-id=152 op=LOAD Dec 16 12:54:04.467313 kernel: audit: type=1334 audit(1765889644.463:520): prog-id=152 op=LOAD Dec 16 12:54:04.463000 audit[3188]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2907 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:04.475185 kernel: audit: type=1300 audit(1765889644.463:520): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2907 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:04.483134 kernel: audit: type=1327 audit(1765889644.463:520): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634376463393065356566336563316632663563346333623038386636 Dec 16 12:54:04.463000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634376463393065356566336563316632663563346333623038386636 Dec 16 12:54:04.463000 audit: BPF prog-id=152 op=UNLOAD Dec 16 12:54:04.492371 kernel: audit: type=1334 audit(1765889644.463:521): prog-id=152 op=UNLOAD Dec 16 12:54:04.492423 kernel: audit: type=1300 audit(1765889644.463:521): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2907 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:04.463000 audit[3188]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2907 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:04.463000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634376463393065356566336563316632663563346333623038386636 Dec 16 12:54:04.500466 kernel: audit: type=1327 audit(1765889644.463:521): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634376463393065356566336563316632663563346333623038386636 Dec 16 12:54:04.463000 audit: BPF prog-id=153 op=LOAD Dec 16 12:54:04.504310 kernel: audit: type=1334 audit(1765889644.463:522): prog-id=153 op=LOAD Dec 16 12:54:04.463000 audit[3188]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2907 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:04.463000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634376463393065356566336563316632663563346333623038386636 Dec 16 12:54:04.463000 audit: BPF prog-id=154 op=LOAD Dec 16 12:54:04.463000 audit[3188]: SYSCALL arch=c000003e syscall=321 success=yes 
exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2907 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:04.463000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634376463393065356566336563316632663563346333623038386636 Dec 16 12:54:04.463000 audit: BPF prog-id=154 op=UNLOAD Dec 16 12:54:04.463000 audit[3188]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2907 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:04.463000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634376463393065356566336563316632663563346333623038386636 Dec 16 12:54:04.463000 audit: BPF prog-id=153 op=UNLOAD Dec 16 12:54:04.463000 audit[3188]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2907 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:04.463000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634376463393065356566336563316632663563346333623038386636 Dec 16 12:54:04.463000 audit: BPF prog-id=155 op=LOAD Dec 16 12:54:04.463000 audit[3188]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2907 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:04.463000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634376463393065356566336563316632663563346333623038386636 Dec 16 12:54:04.525065 containerd[1649]: time="2025-12-16T12:54:04.525012146Z" level=info msg="StartContainer for \"f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157\" returns successfully" Dec 16 12:54:05.890382 update_engine[1619]: I20251216 12:54:05.889513 1619 update_attempter.cc:509] Updating boot flags... Dec 16 12:54:06.806270 sudo[1869]: pam_unix(sudo:session): session closed for user root Dec 16 12:54:06.805000 audit[1869]: USER_END pid=1869 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:54:06.805000 audit[1869]: CRED_DISP pid=1869 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 12:54:06.976251 sshd[1868]: Connection closed by 147.75.109.163 port 60086 Dec 16 12:54:06.977390 sshd-session[1865]: pam_unix(sshd:session): session closed for user core Dec 16 12:54:06.977000 audit[1865]: USER_END pid=1865 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:54:06.977000 audit[1865]: CRED_DISP pid=1865 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:54:06.981233 systemd-logind[1616]: Session 7 logged out. Waiting for processes to exit. Dec 16 12:54:06.981475 systemd[1]: sshd@6-77.42.41.174:22-147.75.109.163:60086.service: Deactivated successfully. Dec 16 12:54:06.980000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-77.42.41.174:22-147.75.109.163:60086 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:54:06.983403 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 12:54:06.983628 systemd[1]: session-7.scope: Consumed 3.854s CPU time, 156.8M memory peak. Dec 16 12:54:06.985482 systemd-logind[1616]: Removed session 7. 
Dec 16 12:54:08.696000 audit[3264]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3264 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:08.698793 kernel: kauditd_printk_skb: 19 callbacks suppressed Dec 16 12:54:08.698869 kernel: audit: type=1325 audit(1765889648.696:532): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3264 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:08.696000 audit[3264]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffde748bb20 a2=0 a3=7ffde748bb0c items=0 ppid=2949 pid=3264 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:08.714181 kernel: audit: type=1300 audit(1765889648.696:532): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffde748bb20 a2=0 a3=7ffde748bb0c items=0 ppid=2949 pid=3264 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:08.696000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:08.720182 kernel: audit: type=1327 audit(1765889648.696:532): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:08.703000 audit[3264]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3264 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:08.725241 kernel: audit: type=1325 audit(1765889648.703:533): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3264 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:08.703000 audit[3264]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=2700 a0=3 a1=7ffde748bb20 a2=0 a3=0 items=0 ppid=2949 pid=3264 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:08.733218 kernel: audit: type=1300 audit(1765889648.703:533): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffde748bb20 a2=0 a3=0 items=0 ppid=2949 pid=3264 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:08.703000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:08.739199 kernel: audit: type=1327 audit(1765889648.703:533): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:08.719000 audit[3266]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3266 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:08.745202 kernel: audit: type=1325 audit(1765889648.719:534): table=filter:107 family=2 entries=16 op=nft_register_rule pid=3266 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:08.719000 audit[3266]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc90ba8cd0 a2=0 a3=7ffc90ba8cbc items=0 ppid=2949 pid=3266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:08.753178 kernel: audit: type=1300 audit(1765889648.719:534): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc90ba8cd0 a2=0 a3=7ffc90ba8cbc items=0 ppid=2949 pid=3266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:08.719000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:08.761207 kernel: audit: type=1327 audit(1765889648.719:534): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:08.733000 audit[3266]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3266 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:08.767208 kernel: audit: type=1325 audit(1765889648.733:535): table=nat:108 family=2 entries=12 op=nft_register_rule pid=3266 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:08.733000 audit[3266]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc90ba8cd0 a2=0 a3=0 items=0 ppid=2949 pid=3266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:08.733000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:12.096000 audit[3268]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3268 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:12.096000 audit[3268]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7fffa399d590 a2=0 a3=7fffa399d57c items=0 ppid=2949 pid=3268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:12.096000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:12.103000 audit[3268]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3268 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:12.103000 audit[3268]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fffa399d590 a2=0 a3=0 items=0 ppid=2949 pid=3268 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:12.103000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:12.139000 audit[3270]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3270 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:12.139000 audit[3270]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffc90b895b0 a2=0 a3=7ffc90b8959c items=0 ppid=2949 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:12.139000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:12.155000 audit[3270]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3270 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:12.155000 audit[3270]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc90b895b0 a2=0 a3=0 items=0 ppid=2949 pid=3270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:54:12.155000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:13.168000 audit[3272]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3272 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:13.168000 audit[3272]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffe7fd3f0b0 a2=0 a3=7ffe7fd3f09c items=0 ppid=2949 pid=3272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:13.168000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:13.176000 audit[3272]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3272 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:13.176000 audit[3272]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe7fd3f0b0 a2=0 a3=0 items=0 ppid=2949 pid=3272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:13.176000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:13.921034 systemd[1]: Created slice kubepods-besteffort-pod0b116b65_15c4_4094_9ead_ff0cbd044d89.slice - libcontainer container kubepods-besteffort-pod0b116b65_15c4_4094_9ead_ff0cbd044d89.slice. 
Dec 16 12:54:14.043207 kubelet[2807]: I1216 12:54:14.043077 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b116b65-15c4-4094-9ead-ff0cbd044d89-tigera-ca-bundle\") pod \"calico-typha-57c547476b-9m68n\" (UID: \"0b116b65-15c4-4094-9ead-ff0cbd044d89\") " pod="calico-system/calico-typha-57c547476b-9m68n" Dec 16 12:54:14.043682 kubelet[2807]: I1216 12:54:14.043310 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0b116b65-15c4-4094-9ead-ff0cbd044d89-typha-certs\") pod \"calico-typha-57c547476b-9m68n\" (UID: \"0b116b65-15c4-4094-9ead-ff0cbd044d89\") " pod="calico-system/calico-typha-57c547476b-9m68n" Dec 16 12:54:14.043682 kubelet[2807]: I1216 12:54:14.043412 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7fs\" (UniqueName: \"kubernetes.io/projected/0b116b65-15c4-4094-9ead-ff0cbd044d89-kube-api-access-8b7fs\") pod \"calico-typha-57c547476b-9m68n\" (UID: \"0b116b65-15c4-4094-9ead-ff0cbd044d89\") " pod="calico-system/calico-typha-57c547476b-9m68n" Dec 16 12:54:14.145511 kubelet[2807]: I1216 12:54:14.144041 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d872bdc8-ad14-44d7-8acd-0761965e54a7-flexvol-driver-host\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " pod="calico-system/calico-node-ksf25" Dec 16 12:54:14.145511 kubelet[2807]: I1216 12:54:14.145310 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d872bdc8-ad14-44d7-8acd-0761965e54a7-lib-modules\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " 
pod="calico-system/calico-node-ksf25" Dec 16 12:54:14.145511 kubelet[2807]: I1216 12:54:14.145347 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvcdr\" (UniqueName: \"kubernetes.io/projected/d872bdc8-ad14-44d7-8acd-0761965e54a7-kube-api-access-cvcdr\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " pod="calico-system/calico-node-ksf25" Dec 16 12:54:14.145511 kubelet[2807]: I1216 12:54:14.145381 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d872bdc8-ad14-44d7-8acd-0761965e54a7-tigera-ca-bundle\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " pod="calico-system/calico-node-ksf25" Dec 16 12:54:14.145511 kubelet[2807]: I1216 12:54:14.145407 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d872bdc8-ad14-44d7-8acd-0761965e54a7-var-lib-calico\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " pod="calico-system/calico-node-ksf25" Dec 16 12:54:14.145803 kubelet[2807]: I1216 12:54:14.145457 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d872bdc8-ad14-44d7-8acd-0761965e54a7-policysync\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " pod="calico-system/calico-node-ksf25" Dec 16 12:54:14.147432 kubelet[2807]: I1216 12:54:14.145484 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d872bdc8-ad14-44d7-8acd-0761965e54a7-var-run-calico\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " pod="calico-system/calico-node-ksf25" Dec 16 
12:54:14.147432 kubelet[2807]: I1216 12:54:14.146057 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d872bdc8-ad14-44d7-8acd-0761965e54a7-node-certs\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " pod="calico-system/calico-node-ksf25" Dec 16 12:54:14.147432 kubelet[2807]: I1216 12:54:14.146076 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d872bdc8-ad14-44d7-8acd-0761965e54a7-cni-net-dir\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " pod="calico-system/calico-node-ksf25" Dec 16 12:54:14.147432 kubelet[2807]: I1216 12:54:14.146098 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d872bdc8-ad14-44d7-8acd-0761965e54a7-cni-bin-dir\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " pod="calico-system/calico-node-ksf25" Dec 16 12:54:14.147432 kubelet[2807]: I1216 12:54:14.146122 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d872bdc8-ad14-44d7-8acd-0761965e54a7-cni-log-dir\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " pod="calico-system/calico-node-ksf25" Dec 16 12:54:14.147646 kubelet[2807]: I1216 12:54:14.146135 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d872bdc8-ad14-44d7-8acd-0761965e54a7-xtables-lock\") pod \"calico-node-ksf25\" (UID: \"d872bdc8-ad14-44d7-8acd-0761965e54a7\") " pod="calico-system/calico-node-ksf25" Dec 16 12:54:14.154805 systemd[1]: Created slice 
kubepods-besteffort-podd872bdc8_ad14_44d7_8acd_0761965e54a7.slice - libcontainer container kubepods-besteffort-podd872bdc8_ad14_44d7_8acd_0761965e54a7.slice. Dec 16 12:54:14.196000 audit[3276]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3276 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:14.199954 kernel: kauditd_printk_skb: 20 callbacks suppressed Dec 16 12:54:14.200527 kernel: audit: type=1325 audit(1765889654.196:542): table=filter:115 family=2 entries=21 op=nft_register_rule pid=3276 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:14.196000 audit[3276]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffcb01790f0 a2=0 a3=7ffcb01790dc items=0 ppid=2949 pid=3276 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.196000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:14.212654 kernel: audit: type=1300 audit(1765889654.196:542): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffcb01790f0 a2=0 a3=7ffcb01790dc items=0 ppid=2949 pid=3276 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.212704 kernel: audit: type=1327 audit(1765889654.196:542): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:14.204000 audit[3276]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3276 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:14.216518 kernel: audit: type=1325 audit(1765889654.204:543): table=nat:116 family=2 entries=12 
op=nft_register_rule pid=3276 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:14.220222 kernel: audit: type=1300 audit(1765889654.204:543): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcb01790f0 a2=0 a3=0 items=0 ppid=2949 pid=3276 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.204000 audit[3276]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcb01790f0 a2=0 a3=0 items=0 ppid=2949 pid=3276 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.227816 containerd[1649]: time="2025-12-16T12:54:14.227752792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-57c547476b-9m68n,Uid:0b116b65-15c4-4094-9ead-ff0cbd044d89,Namespace:calico-system,Attempt:0,}" Dec 16 12:54:14.204000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:14.229873 kernel: audit: type=1327 audit(1765889654.204:543): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:14.259000 kubelet[2807]: E1216 12:54:14.258937 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.259000 kubelet[2807]: W1216 12:54:14.258954 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.259753 kubelet[2807]: E1216 12:54:14.259724 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating 
Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.270919 kubelet[2807]: E1216 12:54:14.270878 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.270919 kubelet[2807]: W1216 12:54:14.270903 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.270919 kubelet[2807]: E1216 12:54:14.270922 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.320570 containerd[1649]: time="2025-12-16T12:54:14.320504648Z" level=info msg="connecting to shim f27f48b43c866f61c4051b24f5ab8b67b7c539c5da23ff109c5fe6b13ce40441" address="unix:///run/containerd/s/6f2906e2a4aba124d68c6a7068cd5ea2d28623eaeb9dd75fa7ef74e44da350cc" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:54:14.342235 kubelet[2807]: E1216 12:54:14.342038 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:54:14.345045 kubelet[2807]: E1216 12:54:14.345030 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.346240 kubelet[2807]: W1216 12:54:14.345355 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.346240 kubelet[2807]: E1216 12:54:14.345375 2807 
plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.346894 kubelet[2807]: E1216 12:54:14.346815 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.346894 kubelet[2807]: W1216 12:54:14.346829 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.346894 kubelet[2807]: E1216 12:54:14.346844 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.347280 kubelet[2807]: E1216 12:54:14.347235 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.347280 kubelet[2807]: W1216 12:54:14.347244 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.347280 kubelet[2807]: E1216 12:54:14.347253 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.349224 kubelet[2807]: E1216 12:54:14.349199 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.349224 kubelet[2807]: W1216 12:54:14.349215 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.349224 kubelet[2807]: E1216 12:54:14.349225 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.350128 kubelet[2807]: E1216 12:54:14.350091 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.350128 kubelet[2807]: W1216 12:54:14.350106 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.350913 kubelet[2807]: E1216 12:54:14.350890 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.351616 kubelet[2807]: E1216 12:54:14.351596 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.351616 kubelet[2807]: W1216 12:54:14.351609 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.351616 kubelet[2807]: E1216 12:54:14.351618 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.352258 kubelet[2807]: E1216 12:54:14.352132 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.352258 kubelet[2807]: W1216 12:54:14.352254 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.352332 kubelet[2807]: E1216 12:54:14.352266 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.353040 kubelet[2807]: E1216 12:54:14.353016 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.353314 kubelet[2807]: W1216 12:54:14.353291 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.353314 kubelet[2807]: E1216 12:54:14.353309 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.353729 kubelet[2807]: E1216 12:54:14.353706 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.353729 kubelet[2807]: W1216 12:54:14.353721 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.353729 kubelet[2807]: E1216 12:54:14.353729 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.354327 systemd[1]: Started cri-containerd-f27f48b43c866f61c4051b24f5ab8b67b7c539c5da23ff109c5fe6b13ce40441.scope - libcontainer container f27f48b43c866f61c4051b24f5ab8b67b7c539c5da23ff109c5fe6b13ce40441. 
Dec 16 12:54:14.355096 kubelet[2807]: E1216 12:54:14.355069 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.355096 kubelet[2807]: W1216 12:54:14.355086 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.355096 kubelet[2807]: E1216 12:54:14.355094 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.357727 kubelet[2807]: E1216 12:54:14.357702 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.357727 kubelet[2807]: W1216 12:54:14.357718 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.357727 kubelet[2807]: E1216 12:54:14.357726 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.357861 kubelet[2807]: E1216 12:54:14.357841 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.357861 kubelet[2807]: W1216 12:54:14.357855 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.357861 kubelet[2807]: E1216 12:54:14.357861 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.358217 kubelet[2807]: E1216 12:54:14.358197 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.358217 kubelet[2807]: W1216 12:54:14.358210 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.358217 kubelet[2807]: E1216 12:54:14.358218 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.358535 kubelet[2807]: E1216 12:54:14.358514 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.358535 kubelet[2807]: W1216 12:54:14.358530 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.358535 kubelet[2807]: E1216 12:54:14.358538 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.358794 kubelet[2807]: E1216 12:54:14.358694 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.358794 kubelet[2807]: W1216 12:54:14.358702 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.358794 kubelet[2807]: E1216 12:54:14.358709 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.360176 kubelet[2807]: E1216 12:54:14.358993 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.360176 kubelet[2807]: W1216 12:54:14.359004 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.360176 kubelet[2807]: E1216 12:54:14.359011 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.360176 kubelet[2807]: E1216 12:54:14.359522 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.360176 kubelet[2807]: W1216 12:54:14.359552 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.360176 kubelet[2807]: E1216 12:54:14.359561 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.360176 kubelet[2807]: E1216 12:54:14.359724 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.360176 kubelet[2807]: W1216 12:54:14.359732 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.360176 kubelet[2807]: E1216 12:54:14.359739 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.360176 kubelet[2807]: E1216 12:54:14.359859 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.360366 kubelet[2807]: W1216 12:54:14.359866 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.360366 kubelet[2807]: E1216 12:54:14.359873 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.360366 kubelet[2807]: E1216 12:54:14.360015 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.360366 kubelet[2807]: W1216 12:54:14.360022 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.360366 kubelet[2807]: E1216 12:54:14.360029 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.360366 kubelet[2807]: E1216 12:54:14.360332 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.360366 kubelet[2807]: W1216 12:54:14.360341 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.360366 kubelet[2807]: E1216 12:54:14.360350 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.360495 kubelet[2807]: I1216 12:54:14.360422 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b5e01eba-2e7b-44aa-9650-696a129f0a90-registration-dir\") pod \"csi-node-driver-xdkpf\" (UID: \"b5e01eba-2e7b-44aa-9650-696a129f0a90\") " pod="calico-system/csi-node-driver-xdkpf" Dec 16 12:54:14.360634 kubelet[2807]: E1216 12:54:14.360593 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.360634 kubelet[2807]: W1216 12:54:14.360608 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.360634 kubelet[2807]: E1216 12:54:14.360621 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.360807 kubelet[2807]: I1216 12:54:14.360784 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b5e01eba-2e7b-44aa-9650-696a129f0a90-socket-dir\") pod \"csi-node-driver-xdkpf\" (UID: \"b5e01eba-2e7b-44aa-9650-696a129f0a90\") " pod="calico-system/csi-node-driver-xdkpf" Dec 16 12:54:14.360845 kubelet[2807]: E1216 12:54:14.360755 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.360845 kubelet[2807]: W1216 12:54:14.360827 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.360845 kubelet[2807]: E1216 12:54:14.360835 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.361018 kubelet[2807]: E1216 12:54:14.361003 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.361214 kubelet[2807]: W1216 12:54:14.361077 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.361214 kubelet[2807]: E1216 12:54:14.361093 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.361450 kubelet[2807]: E1216 12:54:14.361432 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.361450 kubelet[2807]: W1216 12:54:14.361441 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.361695 kubelet[2807]: E1216 12:54:14.361516 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.361801 kubelet[2807]: E1216 12:54:14.361783 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.361801 kubelet[2807]: W1216 12:54:14.361791 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.361946 kubelet[2807]: E1216 12:54:14.361847 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.362175 kubelet[2807]: E1216 12:54:14.362108 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.362175 kubelet[2807]: W1216 12:54:14.362129 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.362175 kubelet[2807]: E1216 12:54:14.362140 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.362350 kubelet[2807]: I1216 12:54:14.362285 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5e01eba-2e7b-44aa-9650-696a129f0a90-kubelet-dir\") pod \"csi-node-driver-xdkpf\" (UID: \"b5e01eba-2e7b-44aa-9650-696a129f0a90\") " pod="calico-system/csi-node-driver-xdkpf" Dec 16 12:54:14.362541 kubelet[2807]: E1216 12:54:14.362520 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.362541 kubelet[2807]: W1216 12:54:14.362531 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.362705 kubelet[2807]: E1216 12:54:14.362603 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.362705 kubelet[2807]: I1216 12:54:14.362620 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn6d5\" (UniqueName: \"kubernetes.io/projected/b5e01eba-2e7b-44aa-9650-696a129f0a90-kube-api-access-jn6d5\") pod \"csi-node-driver-xdkpf\" (UID: \"b5e01eba-2e7b-44aa-9650-696a129f0a90\") " pod="calico-system/csi-node-driver-xdkpf" Dec 16 12:54:14.362979 kubelet[2807]: E1216 12:54:14.362960 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.362979 kubelet[2807]: W1216 12:54:14.362969 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.363265 kubelet[2807]: E1216 12:54:14.363027 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.363265 kubelet[2807]: I1216 12:54:14.363042 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/b5e01eba-2e7b-44aa-9650-696a129f0a90-varrun\") pod \"csi-node-driver-xdkpf\" (UID: \"b5e01eba-2e7b-44aa-9650-696a129f0a90\") " pod="calico-system/csi-node-driver-xdkpf" Dec 16 12:54:14.363481 kubelet[2807]: E1216 12:54:14.363472 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.363481 kubelet[2807]: W1216 12:54:14.363505 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.363686 kubelet[2807]: E1216 12:54:14.363667 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.363768 kubelet[2807]: E1216 12:54:14.363760 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.363834 kubelet[2807]: W1216 12:54:14.363798 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.363960 kubelet[2807]: E1216 12:54:14.363883 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.364070 kubelet[2807]: E1216 12:54:14.364053 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.364070 kubelet[2807]: W1216 12:54:14.364060 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.364208 kubelet[2807]: E1216 12:54:14.364128 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.364389 kubelet[2807]: E1216 12:54:14.364364 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.364389 kubelet[2807]: W1216 12:54:14.364373 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.364389 kubelet[2807]: E1216 12:54:14.364380 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.364685 kubelet[2807]: E1216 12:54:14.364641 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.364685 kubelet[2807]: W1216 12:54:14.364668 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.364685 kubelet[2807]: E1216 12:54:14.364675 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.365695 kubelet[2807]: E1216 12:54:14.365684 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.365793 kubelet[2807]: W1216 12:54:14.365783 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.365858 kubelet[2807]: E1216 12:54:14.365841 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.374000 audit: BPF prog-id=156 op=LOAD Dec 16 12:54:14.379206 kernel: audit: type=1334 audit(1765889654.374:544): prog-id=156 op=LOAD Dec 16 12:54:14.379259 kernel: audit: type=1334 audit(1765889654.375:545): prog-id=157 op=LOAD Dec 16 12:54:14.375000 audit: BPF prog-id=157 op=LOAD Dec 16 12:54:14.389641 kernel: audit: type=1300 audit(1765889654.375:545): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=3290 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.375000 audit[3301]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=3290 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.375000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632376634386234336338363666363163343035316232346635616238 Dec 16 12:54:14.375000 audit: BPF prog-id=157 op=UNLOAD Dec 16 12:54:14.375000 audit[3301]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3290 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.397242 kernel: audit: type=1327 audit(1765889654.375:545): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632376634386234336338363666363163343035316232346635616238 Dec 16 12:54:14.375000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632376634386234336338363666363163343035316232346635616238 Dec 16 12:54:14.375000 audit: BPF prog-id=158 op=LOAD Dec 16 12:54:14.375000 audit[3301]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=3290 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.375000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632376634386234336338363666363163343035316232346635616238 Dec 16 12:54:14.375000 audit: BPF prog-id=159 op=LOAD Dec 16 12:54:14.375000 audit[3301]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=3290 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.375000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632376634386234336338363666363163343035316232346635616238 Dec 16 12:54:14.375000 audit: BPF prog-id=159 op=UNLOAD 
Dec 16 12:54:14.375000 audit[3301]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3290 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.375000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632376634386234336338363666363163343035316232346635616238 Dec 16 12:54:14.375000 audit: BPF prog-id=158 op=UNLOAD Dec 16 12:54:14.375000 audit[3301]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3290 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.375000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632376634386234336338363666363163343035316232346635616238 Dec 16 12:54:14.375000 audit: BPF prog-id=160 op=LOAD Dec 16 12:54:14.375000 audit[3301]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=3290 pid=3301 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.375000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632376634386234336338363666363163343035316232346635616238 Dec 16 12:54:14.416476 
containerd[1649]: time="2025-12-16T12:54:14.416424877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-57c547476b-9m68n,Uid:0b116b65-15c4-4094-9ead-ff0cbd044d89,Namespace:calico-system,Attempt:0,} returns sandbox id \"f27f48b43c866f61c4051b24f5ab8b67b7c539c5da23ff109c5fe6b13ce40441\"" Dec 16 12:54:14.418583 containerd[1649]: time="2025-12-16T12:54:14.418552522Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 12:54:14.463913 kubelet[2807]: E1216 12:54:14.463789 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.463913 kubelet[2807]: W1216 12:54:14.463811 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.463913 kubelet[2807]: E1216 12:54:14.463830 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.464817 kubelet[2807]: E1216 12:54:14.464757 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.464817 kubelet[2807]: W1216 12:54:14.464776 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.464817 kubelet[2807]: E1216 12:54:14.464795 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.465655 kubelet[2807]: E1216 12:54:14.465227 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.465655 kubelet[2807]: W1216 12:54:14.465241 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.465655 kubelet[2807]: E1216 12:54:14.465286 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.466094 kubelet[2807]: E1216 12:54:14.465791 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.466094 kubelet[2807]: W1216 12:54:14.465803 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.466094 kubelet[2807]: E1216 12:54:14.465906 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.466490 kubelet[2807]: E1216 12:54:14.466440 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.466490 kubelet[2807]: W1216 12:54:14.466452 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.466708 kubelet[2807]: E1216 12:54:14.466687 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.466929 kubelet[2807]: E1216 12:54:14.466851 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.466929 kubelet[2807]: W1216 12:54:14.466859 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.467474 kubelet[2807]: E1216 12:54:14.467189 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.467474 kubelet[2807]: E1216 12:54:14.467390 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.467474 kubelet[2807]: W1216 12:54:14.467399 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.467816 kubelet[2807]: E1216 12:54:14.467669 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.468082 kubelet[2807]: E1216 12:54:14.468062 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.468082 kubelet[2807]: W1216 12:54:14.468076 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.468623 kubelet[2807]: E1216 12:54:14.468308 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.469242 kubelet[2807]: E1216 12:54:14.469223 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.469242 kubelet[2807]: W1216 12:54:14.469238 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.469345 kubelet[2807]: E1216 12:54:14.469317 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.469583 kubelet[2807]: E1216 12:54:14.469462 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.469583 kubelet[2807]: W1216 12:54:14.469571 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.470508 kubelet[2807]: E1216 12:54:14.470488 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.470753 kubelet[2807]: E1216 12:54:14.470709 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.470753 kubelet[2807]: W1216 12:54:14.470751 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.470918 kubelet[2807]: E1216 12:54:14.470828 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.471069 kubelet[2807]: E1216 12:54:14.471000 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.471069 kubelet[2807]: W1216 12:54:14.471015 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.471138 kubelet[2807]: E1216 12:54:14.471083 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.471503 kubelet[2807]: E1216 12:54:14.471215 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.471503 kubelet[2807]: W1216 12:54:14.471223 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.471503 kubelet[2807]: E1216 12:54:14.471342 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.471503 kubelet[2807]: W1216 12:54:14.471348 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.471503 kubelet[2807]: E1216 12:54:14.471454 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.471503 kubelet[2807]: E1216 12:54:14.471475 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.471503 kubelet[2807]: E1216 12:54:14.471482 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.471503 kubelet[2807]: W1216 12:54:14.471491 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.471654 kubelet[2807]: E1216 12:54:14.471524 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.471887 kubelet[2807]: E1216 12:54:14.471833 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.471887 kubelet[2807]: W1216 12:54:14.471841 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.471887 kubelet[2807]: E1216 12:54:14.471857 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.472305 kubelet[2807]: E1216 12:54:14.472012 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.472305 kubelet[2807]: W1216 12:54:14.472020 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.472305 kubelet[2807]: E1216 12:54:14.472029 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.472305 kubelet[2807]: E1216 12:54:14.472230 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.472305 kubelet[2807]: W1216 12:54:14.472238 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.472466 kubelet[2807]: E1216 12:54:14.472361 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.472633 kubelet[2807]: E1216 12:54:14.472505 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.472633 kubelet[2807]: W1216 12:54:14.472629 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.472861 kubelet[2807]: E1216 12:54:14.472838 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.472861 kubelet[2807]: W1216 12:54:14.472853 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.472909 kubelet[2807]: E1216 12:54:14.472902 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.472928 kubelet[2807]: E1216 12:54:14.472922 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.473319 kubelet[2807]: E1216 12:54:14.473294 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.473319 kubelet[2807]: W1216 12:54:14.473314 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.473752 kubelet[2807]: E1216 12:54:14.473732 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.473752 kubelet[2807]: W1216 12:54:14.473750 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.473839 kubelet[2807]: E1216 12:54:14.473784 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.473950 kubelet[2807]: E1216 12:54:14.473923 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.474101 kubelet[2807]: E1216 12:54:14.474079 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.474101 kubelet[2807]: W1216 12:54:14.474093 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.474101 kubelet[2807]: E1216 12:54:14.474102 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.474718 containerd[1649]: time="2025-12-16T12:54:14.474677142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ksf25,Uid:d872bdc8-ad14-44d7-8acd-0761965e54a7,Namespace:calico-system,Attempt:0,}" Dec 16 12:54:14.474892 kubelet[2807]: E1216 12:54:14.474867 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.474892 kubelet[2807]: W1216 12:54:14.474878 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.475014 kubelet[2807]: E1216 12:54:14.474892 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:14.475391 kubelet[2807]: E1216 12:54:14.475366 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.475391 kubelet[2807]: W1216 12:54:14.475385 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.475455 kubelet[2807]: E1216 12:54:14.475399 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.485897 kubelet[2807]: E1216 12:54:14.485868 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:14.485897 kubelet[2807]: W1216 12:54:14.485893 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:14.486241 kubelet[2807]: E1216 12:54:14.485911 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:14.523677 containerd[1649]: time="2025-12-16T12:54:14.523628693Z" level=info msg="connecting to shim 8696c858eca8b488e242869002ac6e968ada8dd2dabf2400f8d94a0eb31493ed" address="unix:///run/containerd/s/467cd46aeeef9b6d55a02d4c2c858856d3e81ce155b1f4b821fde121dbde8775" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:54:14.546317 systemd[1]: Started cri-containerd-8696c858eca8b488e242869002ac6e968ada8dd2dabf2400f8d94a0eb31493ed.scope - libcontainer container 8696c858eca8b488e242869002ac6e968ada8dd2dabf2400f8d94a0eb31493ed. 
Dec 16 12:54:14.557000 audit: BPF prog-id=161 op=LOAD Dec 16 12:54:14.558000 audit: BPF prog-id=162 op=LOAD Dec 16 12:54:14.558000 audit[3419]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=3408 pid=3419 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.558000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836393663383538656361386234383865323432383639303032616336 Dec 16 12:54:14.558000 audit: BPF prog-id=162 op=UNLOAD Dec 16 12:54:14.558000 audit[3419]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3408 pid=3419 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.558000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836393663383538656361386234383865323432383639303032616336 Dec 16 12:54:14.558000 audit: BPF prog-id=163 op=LOAD Dec 16 12:54:14.558000 audit[3419]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3408 pid=3419 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.558000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836393663383538656361386234383865323432383639303032616336 Dec 16 12:54:14.558000 audit: BPF prog-id=164 op=LOAD Dec 16 12:54:14.558000 audit[3419]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=3408 pid=3419 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.558000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836393663383538656361386234383865323432383639303032616336 Dec 16 12:54:14.558000 audit: BPF prog-id=164 op=UNLOAD Dec 16 12:54:14.558000 audit[3419]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3408 pid=3419 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.558000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836393663383538656361386234383865323432383639303032616336 Dec 16 12:54:14.558000 audit: BPF prog-id=163 op=UNLOAD Dec 16 12:54:14.558000 audit[3419]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3408 pid=3419 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:54:14.558000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836393663383538656361386234383865323432383639303032616336 Dec 16 12:54:14.558000 audit: BPF prog-id=165 op=LOAD Dec 16 12:54:14.558000 audit[3419]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=3408 pid=3419 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:14.558000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836393663383538656361386234383865323432383639303032616336 Dec 16 12:54:14.579567 containerd[1649]: time="2025-12-16T12:54:14.579527077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ksf25,Uid:d872bdc8-ad14-44d7-8acd-0761965e54a7,Namespace:calico-system,Attempt:0,} returns sandbox id \"8696c858eca8b488e242869002ac6e968ada8dd2dabf2400f8d94a0eb31493ed\"" Dec 16 12:54:16.105387 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2164274393.mount: Deactivated successfully. 
Dec 16 12:54:16.317542 kubelet[2807]: E1216 12:54:16.317498 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:54:16.500702 containerd[1649]: time="2025-12-16T12:54:16.500270539Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:16.502674 containerd[1649]: time="2025-12-16T12:54:16.502643734Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.084059723s" Dec 16 12:54:16.502746 containerd[1649]: time="2025-12-16T12:54:16.502733643Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Dec 16 12:54:16.505416 containerd[1649]: time="2025-12-16T12:54:16.504764071Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 12:54:16.515245 containerd[1649]: time="2025-12-16T12:54:16.515202679Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:16.518129 containerd[1649]: time="2025-12-16T12:54:16.518077552Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:16.519224 containerd[1649]: time="2025-12-16T12:54:16.519185997Z" level=info 
msg="CreateContainer within sandbox \"f27f48b43c866f61c4051b24f5ab8b67b7c539c5da23ff109c5fe6b13ce40441\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 12:54:16.520457 containerd[1649]: time="2025-12-16T12:54:16.520422634Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:16.527335 containerd[1649]: time="2025-12-16T12:54:16.526413866Z" level=info msg="Container 5e7fdc2b682e7aceae334c0a27d56b91cde0587018de4e650c6b3b69a1140408: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:54:16.529623 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4111172439.mount: Deactivated successfully. Dec 16 12:54:16.556657 containerd[1649]: time="2025-12-16T12:54:16.556590217Z" level=info msg="CreateContainer within sandbox \"f27f48b43c866f61c4051b24f5ab8b67b7c539c5da23ff109c5fe6b13ce40441\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"5e7fdc2b682e7aceae334c0a27d56b91cde0587018de4e650c6b3b69a1140408\"" Dec 16 12:54:16.557234 containerd[1649]: time="2025-12-16T12:54:16.557121842Z" level=info msg="StartContainer for \"5e7fdc2b682e7aceae334c0a27d56b91cde0587018de4e650c6b3b69a1140408\"" Dec 16 12:54:16.558705 containerd[1649]: time="2025-12-16T12:54:16.558668474Z" level=info msg="connecting to shim 5e7fdc2b682e7aceae334c0a27d56b91cde0587018de4e650c6b3b69a1140408" address="unix:///run/containerd/s/6f2906e2a4aba124d68c6a7068cd5ea2d28623eaeb9dd75fa7ef74e44da350cc" protocol=ttrpc version=3 Dec 16 12:54:16.602333 systemd[1]: Started cri-containerd-5e7fdc2b682e7aceae334c0a27d56b91cde0587018de4e650c6b3b69a1140408.scope - libcontainer container 5e7fdc2b682e7aceae334c0a27d56b91cde0587018de4e650c6b3b69a1140408. 
Dec 16 12:54:16.615000 audit: BPF prog-id=166 op=LOAD Dec 16 12:54:16.615000 audit: BPF prog-id=167 op=LOAD Dec 16 12:54:16.615000 audit[3452]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186238 a2=98 a3=0 items=0 ppid=3290 pid=3452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:16.615000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565376664633262363832653761636561653333346330613237643536 Dec 16 12:54:16.615000 audit: BPF prog-id=167 op=UNLOAD Dec 16 12:54:16.615000 audit[3452]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3290 pid=3452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:16.615000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565376664633262363832653761636561653333346330613237643536 Dec 16 12:54:16.615000 audit: BPF prog-id=168 op=LOAD Dec 16 12:54:16.615000 audit[3452]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=3290 pid=3452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:16.615000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565376664633262363832653761636561653333346330613237643536 Dec 16 12:54:16.616000 audit: BPF prog-id=169 op=LOAD Dec 16 12:54:16.616000 audit[3452]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=3290 pid=3452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:16.616000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565376664633262363832653761636561653333346330613237643536 Dec 16 12:54:16.616000 audit: BPF prog-id=169 op=UNLOAD Dec 16 12:54:16.616000 audit[3452]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3290 pid=3452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:16.616000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565376664633262363832653761636561653333346330613237643536 Dec 16 12:54:16.616000 audit: BPF prog-id=168 op=UNLOAD Dec 16 12:54:16.616000 audit[3452]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3290 pid=3452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:54:16.616000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565376664633262363832653761636561653333346330613237643536 Dec 16 12:54:16.616000 audit: BPF prog-id=170 op=LOAD Dec 16 12:54:16.616000 audit[3452]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=3290 pid=3452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:16.616000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565376664633262363832653761636561653333346330613237643536 Dec 16 12:54:16.655433 containerd[1649]: time="2025-12-16T12:54:16.655395028Z" level=info msg="StartContainer for \"5e7fdc2b682e7aceae334c0a27d56b91cde0587018de4e650c6b3b69a1140408\" returns successfully" Dec 16 12:54:17.472486 kubelet[2807]: I1216 12:54:17.472391 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-57c547476b-9m68n" podStartSLOduration=2.372049701 podStartE2EDuration="4.458002303s" podCreationTimestamp="2025-12-16 12:54:13 +0000 UTC" firstStartedPulling="2025-12-16 12:54:14.417757709 +0000 UTC m=+22.203411711" lastFinishedPulling="2025-12-16 12:54:16.5037103 +0000 UTC m=+24.289364313" observedRunningTime="2025-12-16 12:54:17.452623012 +0000 UTC m=+25.238277025" watchObservedRunningTime="2025-12-16 12:54:17.458002303 +0000 UTC m=+25.243656336" Dec 16 12:54:17.486162 kubelet[2807]: E1216 12:54:17.485962 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 
12:54:17.486162 kubelet[2807]: W1216 12:54:17.486002 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.486162 kubelet[2807]: E1216 12:54:17.486032 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.486625 kubelet[2807]: E1216 12:54:17.486608 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.486773 kubelet[2807]: W1216 12:54:17.486698 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.486773 kubelet[2807]: E1216 12:54:17.486718 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.487197 kubelet[2807]: E1216 12:54:17.487096 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.487197 kubelet[2807]: W1216 12:54:17.487111 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.487197 kubelet[2807]: E1216 12:54:17.487125 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.487642 kubelet[2807]: E1216 12:54:17.487556 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.487642 kubelet[2807]: W1216 12:54:17.487573 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.487642 kubelet[2807]: E1216 12:54:17.487585 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.487957 kubelet[2807]: E1216 12:54:17.487893 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.487957 kubelet[2807]: W1216 12:54:17.487906 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.487957 kubelet[2807]: E1216 12:54:17.487918 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.488456 kubelet[2807]: E1216 12:54:17.488374 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.488456 kubelet[2807]: W1216 12:54:17.488400 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.488456 kubelet[2807]: E1216 12:54:17.488413 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.488857 kubelet[2807]: E1216 12:54:17.488831 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.489061 kubelet[2807]: W1216 12:54:17.488962 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.489061 kubelet[2807]: E1216 12:54:17.488991 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.489538 kubelet[2807]: E1216 12:54:17.489439 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.489538 kubelet[2807]: W1216 12:54:17.489456 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.489538 kubelet[2807]: E1216 12:54:17.489472 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.489900 kubelet[2807]: E1216 12:54:17.489882 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.490040 kubelet[2807]: W1216 12:54:17.489980 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.490040 kubelet[2807]: E1216 12:54:17.489998 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.490418 kubelet[2807]: E1216 12:54:17.490349 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.490418 kubelet[2807]: W1216 12:54:17.490364 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.490418 kubelet[2807]: E1216 12:54:17.490377 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.490786 kubelet[2807]: E1216 12:54:17.490716 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.490786 kubelet[2807]: W1216 12:54:17.490729 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.490786 kubelet[2807]: E1216 12:54:17.490741 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.491322 kubelet[2807]: E1216 12:54:17.491206 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.491322 kubelet[2807]: W1216 12:54:17.491236 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.491322 kubelet[2807]: E1216 12:54:17.491252 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.492083 kubelet[2807]: E1216 12:54:17.492059 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.492321 kubelet[2807]: W1216 12:54:17.492240 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.492321 kubelet[2807]: E1216 12:54:17.492270 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.492680 kubelet[2807]: E1216 12:54:17.492590 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.492680 kubelet[2807]: W1216 12:54:17.492618 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.492680 kubelet[2807]: E1216 12:54:17.492631 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.493032 kubelet[2807]: E1216 12:54:17.492974 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.493032 kubelet[2807]: W1216 12:54:17.492988 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.493032 kubelet[2807]: E1216 12:54:17.493011 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.493744 kubelet[2807]: E1216 12:54:17.493693 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.493744 kubelet[2807]: W1216 12:54:17.493710 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.493744 kubelet[2807]: E1216 12:54:17.493723 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.494330 kubelet[2807]: E1216 12:54:17.494298 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.494330 kubelet[2807]: W1216 12:54:17.494313 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.494563 kubelet[2807]: E1216 12:54:17.494465 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.494811 kubelet[2807]: E1216 12:54:17.494792 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.494920 kubelet[2807]: W1216 12:54:17.494882 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.495278 kubelet[2807]: E1216 12:54:17.495259 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.495384 kubelet[2807]: W1216 12:54:17.495367 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.495586 kubelet[2807]: E1216 12:54:17.495449 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.495811 kubelet[2807]: E1216 12:54:17.495790 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.496031 kubelet[2807]: W1216 12:54:17.495905 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.496031 kubelet[2807]: E1216 12:54:17.495928 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.496510 kubelet[2807]: E1216 12:54:17.496342 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.496510 kubelet[2807]: W1216 12:54:17.496360 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.496510 kubelet[2807]: E1216 12:54:17.496389 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.496767 kubelet[2807]: E1216 12:54:17.496749 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.497009 kubelet[2807]: W1216 12:54:17.496831 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.497009 kubelet[2807]: E1216 12:54:17.496854 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.497243 kubelet[2807]: E1216 12:54:17.497203 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.497448 kubelet[2807]: E1216 12:54:17.497415 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.497448 kubelet[2807]: W1216 12:54:17.497429 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.497624 kubelet[2807]: E1216 12:54:17.497567 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.497950 kubelet[2807]: E1216 12:54:17.497854 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.497950 kubelet[2807]: W1216 12:54:17.497868 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.497950 kubelet[2807]: E1216 12:54:17.497880 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.498528 kubelet[2807]: E1216 12:54:17.498387 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.498528 kubelet[2807]: W1216 12:54:17.498402 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.498528 kubelet[2807]: E1216 12:54:17.498415 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.498746 kubelet[2807]: E1216 12:54:17.498731 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.498901 kubelet[2807]: W1216 12:54:17.498833 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.499234 kubelet[2807]: E1216 12:54:17.498986 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.499478 kubelet[2807]: E1216 12:54:17.499465 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.499562 kubelet[2807]: W1216 12:54:17.499548 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.499631 kubelet[2807]: E1216 12:54:17.499618 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.499957 kubelet[2807]: E1216 12:54:17.499924 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.499957 kubelet[2807]: W1216 12:54:17.499938 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.500267 kubelet[2807]: E1216 12:54:17.500115 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.500643 kubelet[2807]: E1216 12:54:17.500605 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.500643 kubelet[2807]: W1216 12:54:17.500623 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.500890 kubelet[2807]: E1216 12:54:17.500792 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.501253 kubelet[2807]: E1216 12:54:17.501203 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.501253 kubelet[2807]: W1216 12:54:17.501227 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.501497 kubelet[2807]: E1216 12:54:17.501446 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.501728 kubelet[2807]: E1216 12:54:17.501698 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.501728 kubelet[2807]: W1216 12:54:17.501713 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.501897 kubelet[2807]: E1216 12:54:17.501837 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:17.502723 kubelet[2807]: E1216 12:54:17.502232 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.502723 kubelet[2807]: W1216 12:54:17.502252 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.502723 kubelet[2807]: E1216 12:54:17.502269 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:17.503033 kubelet[2807]: E1216 12:54:17.502995 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:17.503121 kubelet[2807]: W1216 12:54:17.503106 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:17.503284 kubelet[2807]: E1216 12:54:17.503264 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:18.313552 kubelet[2807]: E1216 12:54:18.313499 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:54:18.372910 containerd[1649]: time="2025-12-16T12:54:18.372810082Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:18.377003 containerd[1649]: time="2025-12-16T12:54:18.373994137Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:18.377003 containerd[1649]: time="2025-12-16T12:54:18.376276837Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:18.378822 containerd[1649]: time="2025-12-16T12:54:18.378766257Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:18.379535 containerd[1649]: time="2025-12-16T12:54:18.379140083Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.874338632s" Dec 16 12:54:18.379535 containerd[1649]: time="2025-12-16T12:54:18.379210777Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Dec 16 12:54:18.382514 containerd[1649]: time="2025-12-16T12:54:18.382485319Z" level=info msg="CreateContainer within sandbox \"8696c858eca8b488e242869002ac6e968ada8dd2dabf2400f8d94a0eb31493ed\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 12:54:18.395415 containerd[1649]: time="2025-12-16T12:54:18.395377823Z" level=info msg="Container 7aaac69090ca8423f0bd8c6b93068e9d77d5faaef06626ed0a780827d4ae5a49: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:54:18.401722 containerd[1649]: time="2025-12-16T12:54:18.401686904Z" level=info msg="CreateContainer within sandbox \"8696c858eca8b488e242869002ac6e968ada8dd2dabf2400f8d94a0eb31493ed\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"7aaac69090ca8423f0bd8c6b93068e9d77d5faaef06626ed0a780827d4ae5a49\"" Dec 16 12:54:18.402518 containerd[1649]: time="2025-12-16T12:54:18.402498307Z" level=info msg="StartContainer for \"7aaac69090ca8423f0bd8c6b93068e9d77d5faaef06626ed0a780827d4ae5a49\"" Dec 16 12:54:18.404346 containerd[1649]: 
time="2025-12-16T12:54:18.404324214Z" level=info msg="connecting to shim 7aaac69090ca8423f0bd8c6b93068e9d77d5faaef06626ed0a780827d4ae5a49" address="unix:///run/containerd/s/467cd46aeeef9b6d55a02d4c2c858856d3e81ce155b1f4b821fde121dbde8775" protocol=ttrpc version=3 Dec 16 12:54:18.428384 systemd[1]: Started cri-containerd-7aaac69090ca8423f0bd8c6b93068e9d77d5faaef06626ed0a780827d4ae5a49.scope - libcontainer container 7aaac69090ca8423f0bd8c6b93068e9d77d5faaef06626ed0a780827d4ae5a49. Dec 16 12:54:18.445840 kubelet[2807]: I1216 12:54:18.445574 2807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:54:18.477000 audit: BPF prog-id=171 op=LOAD Dec 16 12:54:18.477000 audit[3527]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3408 pid=3527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:18.477000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761616163363930393063613834323366306264386336623933303638 Dec 16 12:54:18.477000 audit: BPF prog-id=172 op=LOAD Dec 16 12:54:18.477000 audit[3527]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3408 pid=3527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:18.477000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761616163363930393063613834323366306264386336623933303638 Dec 16 
12:54:18.477000 audit: BPF prog-id=172 op=UNLOAD Dec 16 12:54:18.477000 audit[3527]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3408 pid=3527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:18.477000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761616163363930393063613834323366306264386336623933303638 Dec 16 12:54:18.477000 audit: BPF prog-id=171 op=UNLOAD Dec 16 12:54:18.477000 audit[3527]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3408 pid=3527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:18.477000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761616163363930393063613834323366306264386336623933303638 Dec 16 12:54:18.477000 audit: BPF prog-id=173 op=LOAD Dec 16 12:54:18.477000 audit[3527]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3408 pid=3527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:18.477000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761616163363930393063613834323366306264386336623933303638 Dec 16 12:54:18.499622 containerd[1649]: time="2025-12-16T12:54:18.499549334Z" level=info msg="StartContainer for \"7aaac69090ca8423f0bd8c6b93068e9d77d5faaef06626ed0a780827d4ae5a49\" returns successfully" Dec 16 12:54:18.501258 kubelet[2807]: E1216 12:54:18.501223 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.501258 kubelet[2807]: W1216 12:54:18.501243 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.501258 kubelet[2807]: E1216 12:54:18.501259 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:18.502344 kubelet[2807]: E1216 12:54:18.502272 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.502344 kubelet[2807]: W1216 12:54:18.502284 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.502344 kubelet[2807]: E1216 12:54:18.502298 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:18.502609 kubelet[2807]: E1216 12:54:18.502552 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.502609 kubelet[2807]: W1216 12:54:18.502562 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.502609 kubelet[2807]: E1216 12:54:18.502572 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:18.502894 kubelet[2807]: E1216 12:54:18.502843 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.502894 kubelet[2807]: W1216 12:54:18.502852 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.502894 kubelet[2807]: E1216 12:54:18.502862 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:18.503328 kubelet[2807]: E1216 12:54:18.503307 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.503328 kubelet[2807]: W1216 12:54:18.503324 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.503599 kubelet[2807]: E1216 12:54:18.503338 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:18.504036 kubelet[2807]: E1216 12:54:18.504017 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.504036 kubelet[2807]: W1216 12:54:18.504030 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.504102 kubelet[2807]: E1216 12:54:18.504039 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:18.504508 kubelet[2807]: E1216 12:54:18.504487 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.504508 kubelet[2807]: W1216 12:54:18.504502 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.504579 kubelet[2807]: E1216 12:54:18.504511 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:18.505112 kubelet[2807]: E1216 12:54:18.505084 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.505112 kubelet[2807]: W1216 12:54:18.505096 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.505112 kubelet[2807]: E1216 12:54:18.505104 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:18.506027 kubelet[2807]: E1216 12:54:18.505994 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.506137 kubelet[2807]: W1216 12:54:18.506119 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.506137 kubelet[2807]: E1216 12:54:18.506134 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:18.507241 kubelet[2807]: E1216 12:54:18.507219 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.507241 kubelet[2807]: W1216 12:54:18.507239 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.507311 kubelet[2807]: E1216 12:54:18.507251 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:18.507405 kubelet[2807]: E1216 12:54:18.507386 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.507405 kubelet[2807]: W1216 12:54:18.507400 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.507405 kubelet[2807]: E1216 12:54:18.507408 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:18.507555 kubelet[2807]: E1216 12:54:18.507539 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.507587 kubelet[2807]: W1216 12:54:18.507556 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.507587 kubelet[2807]: E1216 12:54:18.507570 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:18.507859 kubelet[2807]: E1216 12:54:18.507840 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.507859 kubelet[2807]: W1216 12:54:18.507857 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.507912 kubelet[2807]: E1216 12:54:18.507869 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:18.508014 kubelet[2807]: E1216 12:54:18.507997 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.508014 kubelet[2807]: W1216 12:54:18.508009 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.508063 kubelet[2807]: E1216 12:54:18.508017 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:54:18.508346 kubelet[2807]: E1216 12:54:18.508184 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:54:18.508346 kubelet[2807]: W1216 12:54:18.508211 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:54:18.508346 kubelet[2807]: E1216 12:54:18.508222 2807 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:54:18.511199 systemd[1]: cri-containerd-7aaac69090ca8423f0bd8c6b93068e9d77d5faaef06626ed0a780827d4ae5a49.scope: Deactivated successfully. Dec 16 12:54:18.512000 audit: BPF prog-id=173 op=UNLOAD Dec 16 12:54:18.515494 containerd[1649]: time="2025-12-16T12:54:18.515415021Z" level=info msg="received container exit event container_id:\"7aaac69090ca8423f0bd8c6b93068e9d77d5faaef06626ed0a780827d4ae5a49\" id:\"7aaac69090ca8423f0bd8c6b93068e9d77d5faaef06626ed0a780827d4ae5a49\" pid:3539 exited_at:{seconds:1765889658 nanos:514866365}" Dec 16 12:54:18.535258 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7aaac69090ca8423f0bd8c6b93068e9d77d5faaef06626ed0a780827d4ae5a49-rootfs.mount: Deactivated successfully. 
Dec 16 12:54:19.457718 containerd[1649]: time="2025-12-16T12:54:19.457643625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 12:54:20.314259 kubelet[2807]: E1216 12:54:20.313807 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:54:22.061175 containerd[1649]: time="2025-12-16T12:54:22.060970110Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Dec 16 12:54:22.064075 containerd[1649]: time="2025-12-16T12:54:22.063969365Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 2.606267882s" Dec 16 12:54:22.064075 containerd[1649]: time="2025-12-16T12:54:22.063994582Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Dec 16 12:54:22.066252 containerd[1649]: time="2025-12-16T12:54:22.066087658Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:22.067674 containerd[1649]: time="2025-12-16T12:54:22.067654573Z" level=info msg="CreateContainer within sandbox \"8696c858eca8b488e242869002ac6e968ada8dd2dabf2400f8d94a0eb31493ed\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 12:54:22.068292 containerd[1649]: time="2025-12-16T12:54:22.068050038Z" level=info msg="ImageCreate event 
name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:22.069265 containerd[1649]: time="2025-12-16T12:54:22.069207591Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:22.078846 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2912115576.mount: Deactivated successfully. Dec 16 12:54:22.080200 containerd[1649]: time="2025-12-16T12:54:22.080002292Z" level=info msg="Container b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:54:22.088162 containerd[1649]: time="2025-12-16T12:54:22.088121188Z" level=info msg="CreateContainer within sandbox \"8696c858eca8b488e242869002ac6e968ada8dd2dabf2400f8d94a0eb31493ed\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a\"" Dec 16 12:54:22.088993 containerd[1649]: time="2025-12-16T12:54:22.088962714Z" level=info msg="StartContainer for \"b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a\"" Dec 16 12:54:22.090661 containerd[1649]: time="2025-12-16T12:54:22.090627935Z" level=info msg="connecting to shim b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a" address="unix:///run/containerd/s/467cd46aeeef9b6d55a02d4c2c858856d3e81ce155b1f4b821fde121dbde8775" protocol=ttrpc version=3 Dec 16 12:54:22.114331 systemd[1]: Started cri-containerd-b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a.scope - libcontainer container b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a. 
Dec 16 12:54:22.151000 audit: BPF prog-id=174 op=LOAD Dec 16 12:54:22.154193 kernel: kauditd_printk_skb: 78 callbacks suppressed Dec 16 12:54:22.154280 kernel: audit: type=1334 audit(1765889662.151:574): prog-id=174 op=LOAD Dec 16 12:54:22.160183 kernel: audit: type=1300 audit(1765889662.151:574): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3408 pid=3598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:22.151000 audit[3598]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3408 pid=3598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:22.174241 kernel: audit: type=1327 audit(1765889662.151:574): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233653832383830303433633730386133356233653661303561626565 Dec 16 12:54:22.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233653832383830303433633730386133356233653661303561626565 Dec 16 12:54:22.155000 audit: BPF prog-id=175 op=LOAD Dec 16 12:54:22.176196 kernel: audit: type=1334 audit(1765889662.155:575): prog-id=175 op=LOAD Dec 16 12:54:22.155000 audit[3598]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=3408 pid=3598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:22.180399 kernel: audit: type=1300 audit(1765889662.155:575): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=3408 pid=3598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:22.155000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233653832383830303433633730386133356233653661303561626565 Dec 16 12:54:22.188236 kernel: audit: type=1327 audit(1765889662.155:575): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233653832383830303433633730386133356233653661303561626565 Dec 16 12:54:22.155000 audit: BPF prog-id=175 op=UNLOAD Dec 16 12:54:22.194064 kernel: audit: type=1334 audit(1765889662.155:576): prog-id=175 op=UNLOAD Dec 16 12:54:22.155000 audit[3598]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3408 pid=3598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:22.197550 kernel: audit: type=1300 audit(1765889662.155:576): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3408 pid=3598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:22.155000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233653832383830303433633730386133356233653661303561626565 Dec 16 12:54:22.204784 containerd[1649]: time="2025-12-16T12:54:22.204423570Z" level=info msg="StartContainer for \"b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a\" returns successfully" Dec 16 12:54:22.205526 kernel: audit: type=1327 audit(1765889662.155:576): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233653832383830303433633730386133356233653661303561626565 Dec 16 12:54:22.155000 audit: BPF prog-id=174 op=UNLOAD Dec 16 12:54:22.155000 audit[3598]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3408 pid=3598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:22.155000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233653832383830303433633730386133356233653661303561626565 Dec 16 12:54:22.155000 audit: BPF prog-id=176 op=LOAD Dec 16 12:54:22.155000 audit[3598]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=3408 pid=3598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:22.155000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233653832383830303433633730386133356233653661303561626565 Dec 16 12:54:22.216554 kernel: audit: type=1334 audit(1765889662.155:577): prog-id=174 op=UNLOAD Dec 16 12:54:22.313070 kubelet[2807]: E1216 12:54:22.312961 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:54:22.659590 systemd[1]: cri-containerd-b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a.scope: Deactivated successfully. Dec 16 12:54:22.659838 systemd[1]: cri-containerd-b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a.scope: Consumed 384ms CPU time, 164M memory peak, 6.7M read from disk, 171.3M written to disk. Dec 16 12:54:22.663000 audit: BPF prog-id=176 op=UNLOAD Dec 16 12:54:22.668513 containerd[1649]: time="2025-12-16T12:54:22.668463755Z" level=info msg="received container exit event container_id:\"b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a\" id:\"b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a\" pid:3612 exited_at:{seconds:1765889662 nanos:667459842}" Dec 16 12:54:22.710768 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b3e82880043c708a35b3e6a05abeea81de493e50c9bde0916590d7131db5377a-rootfs.mount: Deactivated successfully. 
Dec 16 12:54:22.742183 kubelet[2807]: I1216 12:54:22.742139 2807 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 16 12:54:22.797994 systemd[1]: Created slice kubepods-besteffort-pod83624a12_e59b_4753_81b9_815a3846bf01.slice - libcontainer container kubepods-besteffort-pod83624a12_e59b_4753_81b9_815a3846bf01.slice. Dec 16 12:54:22.825432 systemd[1]: Created slice kubepods-besteffort-pod942b94f4_174b_4d9a_b7f3_55c25ca8719b.slice - libcontainer container kubepods-besteffort-pod942b94f4_174b_4d9a_b7f3_55c25ca8719b.slice. Dec 16 12:54:22.837369 kubelet[2807]: I1216 12:54:22.837346 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/83624a12-e59b-4753-81b9-815a3846bf01-goldmane-key-pair\") pod \"goldmane-666569f655-9bvxr\" (UID: \"83624a12-e59b-4753-81b9-815a3846bf01\") " pod="calico-system/goldmane-666569f655-9bvxr" Dec 16 12:54:22.837600 kubelet[2807]: I1216 12:54:22.837450 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb9pl\" (UniqueName: \"kubernetes.io/projected/7255bf38-4b45-44eb-b0b0-8e800109b0ec-kube-api-access-xb9pl\") pod \"coredns-668d6bf9bc-5fjr9\" (UID: \"7255bf38-4b45-44eb-b0b0-8e800109b0ec\") " pod="kube-system/coredns-668d6bf9bc-5fjr9" Dec 16 12:54:22.837823 kubelet[2807]: I1216 12:54:22.837735 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn857\" (UniqueName: \"kubernetes.io/projected/9345e167-5638-4038-a959-3d55222d2d5c-kube-api-access-jn857\") pod \"calico-apiserver-6f5b8bcc75-wg5dd\" (UID: \"9345e167-5638-4038-a959-3d55222d2d5c\") " pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" Dec 16 12:54:22.837823 kubelet[2807]: I1216 12:54:22.837765 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4afbd6d6-aac3-4d68-be88-76917639058c-calico-apiserver-certs\") pod \"calico-apiserver-6f5b8bcc75-trldf\" (UID: \"4afbd6d6-aac3-4d68-be88-76917639058c\") " pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" Dec 16 12:54:22.838323 kubelet[2807]: I1216 12:54:22.838279 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5b0dddf-d30f-46e4-b43e-4240324fba15-config-volume\") pod \"coredns-668d6bf9bc-cn7m7\" (UID: \"f5b0dddf-d30f-46e4-b43e-4240324fba15\") " pod="kube-system/coredns-668d6bf9bc-cn7m7" Dec 16 12:54:22.838545 kubelet[2807]: I1216 12:54:22.838486 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfw6z\" (UniqueName: \"kubernetes.io/projected/942b94f4-174b-4d9a-b7f3-55c25ca8719b-kube-api-access-dfw6z\") pod \"whisker-8668c554d5-nkhb4\" (UID: \"942b94f4-174b-4d9a-b7f3-55c25ca8719b\") " pod="calico-system/whisker-8668c554d5-nkhb4" Dec 16 12:54:22.838545 kubelet[2807]: I1216 12:54:22.838514 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9kdg\" (UniqueName: \"kubernetes.io/projected/f5b0dddf-d30f-46e4-b43e-4240324fba15-kube-api-access-s9kdg\") pod \"coredns-668d6bf9bc-cn7m7\" (UID: \"f5b0dddf-d30f-46e4-b43e-4240324fba15\") " pod="kube-system/coredns-668d6bf9bc-cn7m7" Dec 16 12:54:22.838777 kubelet[2807]: I1216 12:54:22.838701 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83624a12-e59b-4753-81b9-815a3846bf01-config\") pod \"goldmane-666569f655-9bvxr\" (UID: \"83624a12-e59b-4753-81b9-815a3846bf01\") " pod="calico-system/goldmane-666569f655-9bvxr" Dec 16 12:54:22.840604 kubelet[2807]: I1216 12:54:22.839177 2807 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83624a12-e59b-4753-81b9-815a3846bf01-goldmane-ca-bundle\") pod \"goldmane-666569f655-9bvxr\" (UID: \"83624a12-e59b-4753-81b9-815a3846bf01\") " pod="calico-system/goldmane-666569f655-9bvxr" Dec 16 12:54:22.840604 kubelet[2807]: I1216 12:54:22.839207 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bft4\" (UniqueName: \"kubernetes.io/projected/4afbd6d6-aac3-4d68-be88-76917639058c-kube-api-access-4bft4\") pod \"calico-apiserver-6f5b8bcc75-trldf\" (UID: \"4afbd6d6-aac3-4d68-be88-76917639058c\") " pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" Dec 16 12:54:22.840604 kubelet[2807]: I1216 12:54:22.839222 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk96r\" (UniqueName: \"kubernetes.io/projected/0318a864-5985-4f05-83eb-6e5fed8acf7e-kube-api-access-pk96r\") pod \"calico-kube-controllers-7cc84b5-xz9fx\" (UID: \"0318a864-5985-4f05-83eb-6e5fed8acf7e\") " pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" Dec 16 12:54:22.840604 kubelet[2807]: I1216 12:54:22.840242 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7255bf38-4b45-44eb-b0b0-8e800109b0ec-config-volume\") pod \"coredns-668d6bf9bc-5fjr9\" (UID: \"7255bf38-4b45-44eb-b0b0-8e800109b0ec\") " pod="kube-system/coredns-668d6bf9bc-5fjr9" Dec 16 12:54:22.840604 kubelet[2807]: I1216 12:54:22.840334 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7gp6\" (UniqueName: \"kubernetes.io/projected/83624a12-e59b-4753-81b9-815a3846bf01-kube-api-access-m7gp6\") pod \"goldmane-666569f655-9bvxr\" (UID: \"83624a12-e59b-4753-81b9-815a3846bf01\") " 
pod="calico-system/goldmane-666569f655-9bvxr" Dec 16 12:54:22.840713 kubelet[2807]: I1216 12:54:22.840364 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0318a864-5985-4f05-83eb-6e5fed8acf7e-tigera-ca-bundle\") pod \"calico-kube-controllers-7cc84b5-xz9fx\" (UID: \"0318a864-5985-4f05-83eb-6e5fed8acf7e\") " pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" Dec 16 12:54:22.840713 kubelet[2807]: I1216 12:54:22.840386 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/942b94f4-174b-4d9a-b7f3-55c25ca8719b-whisker-ca-bundle\") pod \"whisker-8668c554d5-nkhb4\" (UID: \"942b94f4-174b-4d9a-b7f3-55c25ca8719b\") " pod="calico-system/whisker-8668c554d5-nkhb4" Dec 16 12:54:22.840713 kubelet[2807]: I1216 12:54:22.840414 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9345e167-5638-4038-a959-3d55222d2d5c-calico-apiserver-certs\") pod \"calico-apiserver-6f5b8bcc75-wg5dd\" (UID: \"9345e167-5638-4038-a959-3d55222d2d5c\") " pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" Dec 16 12:54:22.840713 kubelet[2807]: I1216 12:54:22.840436 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/942b94f4-174b-4d9a-b7f3-55c25ca8719b-whisker-backend-key-pair\") pod \"whisker-8668c554d5-nkhb4\" (UID: \"942b94f4-174b-4d9a-b7f3-55c25ca8719b\") " pod="calico-system/whisker-8668c554d5-nkhb4" Dec 16 12:54:22.848902 systemd[1]: Created slice kubepods-burstable-podf5b0dddf_d30f_46e4_b43e_4240324fba15.slice - libcontainer container kubepods-burstable-podf5b0dddf_d30f_46e4_b43e_4240324fba15.slice. 
Dec 16 12:54:22.859045 systemd[1]: Created slice kubepods-burstable-pod7255bf38_4b45_44eb_b0b0_8e800109b0ec.slice - libcontainer container kubepods-burstable-pod7255bf38_4b45_44eb_b0b0_8e800109b0ec.slice. Dec 16 12:54:22.865088 systemd[1]: Created slice kubepods-besteffort-pod0318a864_5985_4f05_83eb_6e5fed8acf7e.slice - libcontainer container kubepods-besteffort-pod0318a864_5985_4f05_83eb_6e5fed8acf7e.slice. Dec 16 12:54:22.870663 systemd[1]: Created slice kubepods-besteffort-pod4afbd6d6_aac3_4d68_be88_76917639058c.slice - libcontainer container kubepods-besteffort-pod4afbd6d6_aac3_4d68_be88_76917639058c.slice. Dec 16 12:54:22.877990 systemd[1]: Created slice kubepods-besteffort-pod9345e167_5638_4038_a959_3d55222d2d5c.slice - libcontainer container kubepods-besteffort-pod9345e167_5638_4038_a959_3d55222d2d5c.slice. Dec 16 12:54:23.122968 containerd[1649]: time="2025-12-16T12:54:23.122869402Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9bvxr,Uid:83624a12-e59b-4753-81b9-815a3846bf01,Namespace:calico-system,Attempt:0,}" Dec 16 12:54:23.155851 containerd[1649]: time="2025-12-16T12:54:23.155406222Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8668c554d5-nkhb4,Uid:942b94f4-174b-4d9a-b7f3-55c25ca8719b,Namespace:calico-system,Attempt:0,}" Dec 16 12:54:23.156768 containerd[1649]: time="2025-12-16T12:54:23.156716693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cn7m7,Uid:f5b0dddf-d30f-46e4-b43e-4240324fba15,Namespace:kube-system,Attempt:0,}" Dec 16 12:54:23.163112 containerd[1649]: time="2025-12-16T12:54:23.163074986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5fjr9,Uid:7255bf38-4b45-44eb-b0b0-8e800109b0ec,Namespace:kube-system,Attempt:0,}" Dec 16 12:54:23.169727 containerd[1649]: time="2025-12-16T12:54:23.169656750Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-7cc84b5-xz9fx,Uid:0318a864-5985-4f05-83eb-6e5fed8acf7e,Namespace:calico-system,Attempt:0,}" Dec 16 12:54:23.183632 containerd[1649]: time="2025-12-16T12:54:23.183486174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f5b8bcc75-wg5dd,Uid:9345e167-5638-4038-a959-3d55222d2d5c,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:54:23.186344 containerd[1649]: time="2025-12-16T12:54:23.186319083Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f5b8bcc75-trldf,Uid:4afbd6d6-aac3-4d68-be88-76917639058c,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:54:23.375458 containerd[1649]: time="2025-12-16T12:54:23.374821027Z" level=error msg="Failed to destroy network for sandbox \"add3dc265439c3bda00ccdc7986ef37cc4041d59cfaf3ffea2cd7824e2bb9a66\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.375950 containerd[1649]: time="2025-12-16T12:54:23.375920700Z" level=error msg="Failed to destroy network for sandbox \"0d983febca732f3ff1b9af4b6bfcb8cca056fc025ed9add1f69e08eb9f6b7ace\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.377450 containerd[1649]: time="2025-12-16T12:54:23.377416949Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f5b8bcc75-wg5dd,Uid:9345e167-5638-4038-a959-3d55222d2d5c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"add3dc265439c3bda00ccdc7986ef37cc4041d59cfaf3ffea2cd7824e2bb9a66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Dec 16 12:54:23.377920 containerd[1649]: time="2025-12-16T12:54:23.377894990Z" level=error msg="Failed to destroy network for sandbox \"4f47c7832e055e22a5cae4e39ada55b7472a1478b282635d76e7f6cd4a511fd5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.382870 containerd[1649]: time="2025-12-16T12:54:23.382113931Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9bvxr,Uid:83624a12-e59b-4753-81b9-815a3846bf01,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d983febca732f3ff1b9af4b6bfcb8cca056fc025ed9add1f69e08eb9f6b7ace\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.385117 containerd[1649]: time="2025-12-16T12:54:23.385022353Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5fjr9,Uid:7255bf38-4b45-44eb-b0b0-8e800109b0ec,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4f47c7832e055e22a5cae4e39ada55b7472a1478b282635d76e7f6cd4a511fd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.386727 kubelet[2807]: E1216 12:54:23.386207 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"add3dc265439c3bda00ccdc7986ef37cc4041d59cfaf3ffea2cd7824e2bb9a66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Dec 16 12:54:23.387172 kubelet[2807]: E1216 12:54:23.386701 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d983febca732f3ff1b9af4b6bfcb8cca056fc025ed9add1f69e08eb9f6b7ace\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.388759 kubelet[2807]: E1216 12:54:23.388697 2807 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"add3dc265439c3bda00ccdc7986ef37cc4041d59cfaf3ffea2cd7824e2bb9a66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" Dec 16 12:54:23.388759 kubelet[2807]: E1216 12:54:23.388734 2807 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"add3dc265439c3bda00ccdc7986ef37cc4041d59cfaf3ffea2cd7824e2bb9a66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" Dec 16 12:54:23.389034 kubelet[2807]: E1216 12:54:23.388860 2807 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d983febca732f3ff1b9af4b6bfcb8cca056fc025ed9add1f69e08eb9f6b7ace\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-9bvxr" Dec 16 12:54:23.389034 
kubelet[2807]: E1216 12:54:23.388889 2807 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d983febca732f3ff1b9af4b6bfcb8cca056fc025ed9add1f69e08eb9f6b7ace\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-9bvxr" Dec 16 12:54:23.389979 kubelet[2807]: E1216 12:54:23.389611 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-9bvxr_calico-system(83624a12-e59b-4753-81b9-815a3846bf01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-9bvxr_calico-system(83624a12-e59b-4753-81b9-815a3846bf01)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d983febca732f3ff1b9af4b6bfcb8cca056fc025ed9add1f69e08eb9f6b7ace\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:54:23.391199 kubelet[2807]: E1216 12:54:23.390275 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6f5b8bcc75-wg5dd_calico-apiserver(9345e167-5638-4038-a959-3d55222d2d5c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6f5b8bcc75-wg5dd_calico-apiserver(9345e167-5638-4038-a959-3d55222d2d5c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"add3dc265439c3bda00ccdc7986ef37cc4041d59cfaf3ffea2cd7824e2bb9a66\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:54:23.391199 kubelet[2807]: E1216 12:54:23.390416 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4f47c7832e055e22a5cae4e39ada55b7472a1478b282635d76e7f6cd4a511fd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.391199 kubelet[2807]: E1216 12:54:23.390444 2807 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4f47c7832e055e22a5cae4e39ada55b7472a1478b282635d76e7f6cd4a511fd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-5fjr9" Dec 16 12:54:23.391310 kubelet[2807]: E1216 12:54:23.390456 2807 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4f47c7832e055e22a5cae4e39ada55b7472a1478b282635d76e7f6cd4a511fd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-5fjr9" Dec 16 12:54:23.391310 kubelet[2807]: E1216 12:54:23.390477 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-5fjr9_kube-system(7255bf38-4b45-44eb-b0b0-8e800109b0ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-5fjr9_kube-system(7255bf38-4b45-44eb-b0b0-8e800109b0ec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"4f47c7832e055e22a5cae4e39ada55b7472a1478b282635d76e7f6cd4a511fd5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-5fjr9" podUID="7255bf38-4b45-44eb-b0b0-8e800109b0ec" Dec 16 12:54:23.396771 containerd[1649]: time="2025-12-16T12:54:23.396734277Z" level=error msg="Failed to destroy network for sandbox \"72297cab4adaa2af74fc459a629bad0351061f64cc9f655a948cdb8801b1e1d8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.398585 containerd[1649]: time="2025-12-16T12:54:23.398552052Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f5b8bcc75-trldf,Uid:4afbd6d6-aac3-4d68-be88-76917639058c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"72297cab4adaa2af74fc459a629bad0351061f64cc9f655a948cdb8801b1e1d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.398751 kubelet[2807]: E1216 12:54:23.398731 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72297cab4adaa2af74fc459a629bad0351061f64cc9f655a948cdb8801b1e1d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.398854 kubelet[2807]: E1216 12:54:23.398841 2807 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"72297cab4adaa2af74fc459a629bad0351061f64cc9f655a948cdb8801b1e1d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" Dec 16 12:54:23.400113 kubelet[2807]: E1216 12:54:23.400076 2807 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72297cab4adaa2af74fc459a629bad0351061f64cc9f655a948cdb8801b1e1d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" Dec 16 12:54:23.400113 kubelet[2807]: E1216 12:54:23.400122 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6f5b8bcc75-trldf_calico-apiserver(4afbd6d6-aac3-4d68-be88-76917639058c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6f5b8bcc75-trldf_calico-apiserver(4afbd6d6-aac3-4d68-be88-76917639058c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"72297cab4adaa2af74fc459a629bad0351061f64cc9f655a948cdb8801b1e1d8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:54:23.402850 containerd[1649]: time="2025-12-16T12:54:23.402782054Z" level=error msg="Failed to destroy network for sandbox \"747c832450f8996364299cb73a84ed119e68b7c8f2eb7b4a07b340c13e3be9e2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Dec 16 12:54:23.407340 containerd[1649]: time="2025-12-16T12:54:23.407293426Z" level=error msg="Failed to destroy network for sandbox \"fa0445e72d425170b2b9c456dda8b7dca65e40fc7f00b0d8375da6175c56b58f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.407521 containerd[1649]: time="2025-12-16T12:54:23.407494115Z" level=error msg="Failed to destroy network for sandbox \"f53aa9fe33d8fd46422940b7760eae04dbe5e6543571dab5760753550148fa41\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.409224 containerd[1649]: time="2025-12-16T12:54:23.409077399Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cc84b5-xz9fx,Uid:0318a864-5985-4f05-83eb-6e5fed8acf7e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f53aa9fe33d8fd46422940b7760eae04dbe5e6543571dab5760753550148fa41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.409778 kubelet[2807]: E1216 12:54:23.409664 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f53aa9fe33d8fd46422940b7760eae04dbe5e6543571dab5760753550148fa41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.409815 kubelet[2807]: E1216 12:54:23.409783 2807 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"f53aa9fe33d8fd46422940b7760eae04dbe5e6543571dab5760753550148fa41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" Dec 16 12:54:23.409815 kubelet[2807]: E1216 12:54:23.409803 2807 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f53aa9fe33d8fd46422940b7760eae04dbe5e6543571dab5760753550148fa41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" Dec 16 12:54:23.409864 kubelet[2807]: E1216 12:54:23.409831 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7cc84b5-xz9fx_calico-system(0318a864-5985-4f05-83eb-6e5fed8acf7e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7cc84b5-xz9fx_calico-system(0318a864-5985-4f05-83eb-6e5fed8acf7e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f53aa9fe33d8fd46422940b7760eae04dbe5e6543571dab5760753550148fa41\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:54:23.410073 containerd[1649]: time="2025-12-16T12:54:23.410041747Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8668c554d5-nkhb4,Uid:942b94f4-174b-4d9a-b7f3-55c25ca8719b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"747c832450f8996364299cb73a84ed119e68b7c8f2eb7b4a07b340c13e3be9e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.410711 kubelet[2807]: E1216 12:54:23.410285 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"747c832450f8996364299cb73a84ed119e68b7c8f2eb7b4a07b340c13e3be9e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.410711 kubelet[2807]: E1216 12:54:23.410344 2807 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"747c832450f8996364299cb73a84ed119e68b7c8f2eb7b4a07b340c13e3be9e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-8668c554d5-nkhb4" Dec 16 12:54:23.410711 kubelet[2807]: E1216 12:54:23.410358 2807 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"747c832450f8996364299cb73a84ed119e68b7c8f2eb7b4a07b340c13e3be9e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-8668c554d5-nkhb4" Dec 16 12:54:23.410797 kubelet[2807]: E1216 12:54:23.410382 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-8668c554d5-nkhb4_calico-system(942b94f4-174b-4d9a-b7f3-55c25ca8719b)\" with CreatePodSandboxError: \"Failed 
to create sandbox for pod \\\"whisker-8668c554d5-nkhb4_calico-system(942b94f4-174b-4d9a-b7f3-55c25ca8719b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"747c832450f8996364299cb73a84ed119e68b7c8f2eb7b4a07b340c13e3be9e2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-8668c554d5-nkhb4" podUID="942b94f4-174b-4d9a-b7f3-55c25ca8719b" Dec 16 12:54:23.411677 containerd[1649]: time="2025-12-16T12:54:23.411604311Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cn7m7,Uid:f5b0dddf-d30f-46e4-b43e-4240324fba15,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa0445e72d425170b2b9c456dda8b7dca65e40fc7f00b0d8375da6175c56b58f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.411788 kubelet[2807]: E1216 12:54:23.411751 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa0445e72d425170b2b9c456dda8b7dca65e40fc7f00b0d8375da6175c56b58f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:23.411837 kubelet[2807]: E1216 12:54:23.411785 2807 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa0445e72d425170b2b9c456dda8b7dca65e40fc7f00b0d8375da6175c56b58f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-668d6bf9bc-cn7m7" Dec 16 12:54:23.411837 kubelet[2807]: E1216 12:54:23.411803 2807 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa0445e72d425170b2b9c456dda8b7dca65e40fc7f00b0d8375da6175c56b58f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-cn7m7" Dec 16 12:54:23.411837 kubelet[2807]: E1216 12:54:23.411826 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-cn7m7_kube-system(f5b0dddf-d30f-46e4-b43e-4240324fba15)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-cn7m7_kube-system(f5b0dddf-d30f-46e4-b43e-4240324fba15)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fa0445e72d425170b2b9c456dda8b7dca65e40fc7f00b0d8375da6175c56b58f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-cn7m7" podUID="f5b0dddf-d30f-46e4-b43e-4240324fba15" Dec 16 12:54:23.506032 containerd[1649]: time="2025-12-16T12:54:23.505849312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 12:54:24.077406 systemd[1]: run-netns-cni\x2d41226634\x2de346\x2d674a\x2de637\x2db4929dad2412.mount: Deactivated successfully. Dec 16 12:54:24.077551 systemd[1]: run-netns-cni\x2d3537fdbf\x2d2de8\x2dc134\x2deed9\x2dd9ea8cd80cb6.mount: Deactivated successfully. Dec 16 12:54:24.077649 systemd[1]: run-netns-cni\x2d6d3ca4bc\x2d49af\x2d60b5\x2d7f69\x2d64c3b9c20cbc.mount: Deactivated successfully. 
Dec 16 12:54:24.077752 systemd[1]: run-netns-cni\x2db7057bb1\x2dbfcd\x2df8b7\x2d30bb\x2d85b57f029ea3.mount: Deactivated successfully. Dec 16 12:54:24.319995 systemd[1]: Created slice kubepods-besteffort-podb5e01eba_2e7b_44aa_9650_696a129f0a90.slice - libcontainer container kubepods-besteffort-podb5e01eba_2e7b_44aa_9650_696a129f0a90.slice. Dec 16 12:54:24.323753 containerd[1649]: time="2025-12-16T12:54:24.323688612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xdkpf,Uid:b5e01eba-2e7b-44aa-9650-696a129f0a90,Namespace:calico-system,Attempt:0,}" Dec 16 12:54:24.403029 containerd[1649]: time="2025-12-16T12:54:24.400764961Z" level=error msg="Failed to destroy network for sandbox \"0f56a27f0c851f2c657041f836cb03ba5c9f94ce9430f4c4216809bd3c5fd6fc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:24.402752 systemd[1]: run-netns-cni\x2d06098b53\x2df6a3\x2d469c\x2d81df\x2d36327391bddf.mount: Deactivated successfully. 
Dec 16 12:54:24.405556 containerd[1649]: time="2025-12-16T12:54:24.405499111Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xdkpf,Uid:b5e01eba-2e7b-44aa-9650-696a129f0a90,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f56a27f0c851f2c657041f836cb03ba5c9f94ce9430f4c4216809bd3c5fd6fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:24.405878 kubelet[2807]: E1216 12:54:24.405835 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f56a27f0c851f2c657041f836cb03ba5c9f94ce9430f4c4216809bd3c5fd6fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:54:24.406149 kubelet[2807]: E1216 12:54:24.405896 2807 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f56a27f0c851f2c657041f836cb03ba5c9f94ce9430f4c4216809bd3c5fd6fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xdkpf" Dec 16 12:54:24.406149 kubelet[2807]: E1216 12:54:24.405918 2807 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f56a27f0c851f2c657041f836cb03ba5c9f94ce9430f4c4216809bd3c5fd6fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xdkpf" 
Dec 16 12:54:24.406149 kubelet[2807]: E1216 12:54:24.405975 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-xdkpf_calico-system(b5e01eba-2e7b-44aa-9650-696a129f0a90)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-xdkpf_calico-system(b5e01eba-2e7b-44aa-9650-696a129f0a90)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0f56a27f0c851f2c657041f836cb03ba5c9f94ce9430f4c4216809bd3c5fd6fc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:54:27.614680 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2824431024.mount: Deactivated successfully. Dec 16 12:54:27.671614 containerd[1649]: time="2025-12-16T12:54:27.668548672Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Dec 16 12:54:27.673928 containerd[1649]: time="2025-12-16T12:54:27.673894757Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 4.168000851s" Dec 16 12:54:27.674789 containerd[1649]: time="2025-12-16T12:54:27.661089369Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:27.674789 containerd[1649]: time="2025-12-16T12:54:27.674629920Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:27.675632 containerd[1649]: time="2025-12-16T12:54:27.674943661Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:54:27.679423 containerd[1649]: time="2025-12-16T12:54:27.679381145Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Dec 16 12:54:27.708216 containerd[1649]: time="2025-12-16T12:54:27.708179100Z" level=info msg="CreateContainer within sandbox \"8696c858eca8b488e242869002ac6e968ada8dd2dabf2400f8d94a0eb31493ed\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 12:54:27.719485 kubelet[2807]: I1216 12:54:27.719457 2807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:54:27.773491 containerd[1649]: time="2025-12-16T12:54:27.773179768Z" level=info msg="Container c854840dfe0d41b99e098d875ae2a30b87cce30317b476661007b5d41db4364a: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:54:27.774433 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1078010167.mount: Deactivated successfully. 
Dec 16 12:54:27.821655 containerd[1649]: time="2025-12-16T12:54:27.821573433Z" level=info msg="CreateContainer within sandbox \"8696c858eca8b488e242869002ac6e968ada8dd2dabf2400f8d94a0eb31493ed\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c854840dfe0d41b99e098d875ae2a30b87cce30317b476661007b5d41db4364a\"" Dec 16 12:54:27.822822 containerd[1649]: time="2025-12-16T12:54:27.822754728Z" level=info msg="StartContainer for \"c854840dfe0d41b99e098d875ae2a30b87cce30317b476661007b5d41db4364a\"" Dec 16 12:54:27.826087 containerd[1649]: time="2025-12-16T12:54:27.826060982Z" level=info msg="connecting to shim c854840dfe0d41b99e098d875ae2a30b87cce30317b476661007b5d41db4364a" address="unix:///run/containerd/s/467cd46aeeef9b6d55a02d4c2c858856d3e81ce155b1f4b821fde121dbde8775" protocol=ttrpc version=3 Dec 16 12:54:27.836614 kernel: kauditd_printk_skb: 6 callbacks suppressed Dec 16 12:54:27.837878 kernel: audit: type=1325 audit(1765889667.829:580): table=filter:117 family=2 entries=21 op=nft_register_rule pid=3873 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:27.829000 audit[3873]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3873 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:27.829000 audit[3873]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffcac165f40 a2=0 a3=7ffcac165f2c items=0 ppid=2949 pid=3873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:27.849108 kernel: audit: type=1300 audit(1765889667.829:580): arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffcac165f40 a2=0 a3=7ffcac165f2c items=0 ppid=2949 pid=3873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:27.849150 kernel: audit: type=1327 audit(1765889667.829:580): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:27.829000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:27.837000 audit[3873]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3873 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:27.866228 kernel: audit: type=1325 audit(1765889667.837:581): table=nat:118 family=2 entries=19 op=nft_register_chain pid=3873 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:27.866275 kernel: audit: type=1300 audit(1765889667.837:581): arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffcac165f40 a2=0 a3=7ffcac165f2c items=0 ppid=2949 pid=3873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:27.837000 audit[3873]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffcac165f40 a2=0 a3=7ffcac165f2c items=0 ppid=2949 pid=3873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:27.837000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:27.872184 kernel: audit: type=1327 audit(1765889667.837:581): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:27.926347 systemd[1]: Started cri-containerd-c854840dfe0d41b99e098d875ae2a30b87cce30317b476661007b5d41db4364a.scope - 
libcontainer container c854840dfe0d41b99e098d875ae2a30b87cce30317b476661007b5d41db4364a. Dec 16 12:54:27.977000 audit: BPF prog-id=177 op=LOAD Dec 16 12:54:27.977000 audit[3874]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=3408 pid=3874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:27.984185 kernel: audit: type=1334 audit(1765889667.977:582): prog-id=177 op=LOAD Dec 16 12:54:27.984244 kernel: audit: type=1300 audit(1765889667.977:582): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=3408 pid=3874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:27.977000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353438343064666530643431623939653039386438373561653261 Dec 16 12:54:27.994626 kernel: audit: type=1327 audit(1765889667.977:582): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353438343064666530643431623939653039386438373561653261 Dec 16 12:54:27.977000 audit: BPF prog-id=178 op=LOAD Dec 16 12:54:28.002087 kernel: audit: type=1334 audit(1765889667.977:583): prog-id=178 op=LOAD Dec 16 12:54:27.977000 audit[3874]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=3408 pid=3874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:27.977000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353438343064666530643431623939653039386438373561653261 Dec 16 12:54:27.977000 audit: BPF prog-id=178 op=UNLOAD Dec 16 12:54:27.977000 audit[3874]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3408 pid=3874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:27.977000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353438343064666530643431623939653039386438373561653261 Dec 16 12:54:27.977000 audit: BPF prog-id=177 op=UNLOAD Dec 16 12:54:27.977000 audit[3874]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3408 pid=3874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:27.977000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353438343064666530643431623939653039386438373561653261 Dec 16 12:54:27.977000 audit: BPF prog-id=179 op=LOAD Dec 16 12:54:27.977000 audit[3874]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=3408 pid=3874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:27.977000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338353438343064666530643431623939653039386438373561653261 Dec 16 12:54:28.030496 containerd[1649]: time="2025-12-16T12:54:28.030414563Z" level=info msg="StartContainer for \"c854840dfe0d41b99e098d875ae2a30b87cce30317b476661007b5d41db4364a\" returns successfully" Dec 16 12:54:28.243051 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 12:54:28.243192 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Dec 16 12:54:28.495448 kubelet[2807]: I1216 12:54:28.495282 2807 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/942b94f4-174b-4d9a-b7f3-55c25ca8719b-whisker-ca-bundle\") pod \"942b94f4-174b-4d9a-b7f3-55c25ca8719b\" (UID: \"942b94f4-174b-4d9a-b7f3-55c25ca8719b\") " Dec 16 12:54:28.496220 kubelet[2807]: I1216 12:54:28.496097 2807 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfw6z\" (UniqueName: \"kubernetes.io/projected/942b94f4-174b-4d9a-b7f3-55c25ca8719b-kube-api-access-dfw6z\") pod \"942b94f4-174b-4d9a-b7f3-55c25ca8719b\" (UID: \"942b94f4-174b-4d9a-b7f3-55c25ca8719b\") " Dec 16 12:54:28.496451 kubelet[2807]: I1216 12:54:28.496191 2807 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/942b94f4-174b-4d9a-b7f3-55c25ca8719b-whisker-backend-key-pair\") pod \"942b94f4-174b-4d9a-b7f3-55c25ca8719b\" (UID: \"942b94f4-174b-4d9a-b7f3-55c25ca8719b\") " Dec 16 12:54:28.500980 kubelet[2807]: I1216 12:54:28.500098 2807 operation_generator.go:780] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/942b94f4-174b-4d9a-b7f3-55c25ca8719b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "942b94f4-174b-4d9a-b7f3-55c25ca8719b" (UID: "942b94f4-174b-4d9a-b7f3-55c25ca8719b"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 12:54:28.514087 kubelet[2807]: I1216 12:54:28.514062 2807 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942b94f4-174b-4d9a-b7f3-55c25ca8719b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "942b94f4-174b-4d9a-b7f3-55c25ca8719b" (UID: "942b94f4-174b-4d9a-b7f3-55c25ca8719b"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 12:54:28.514240 kubelet[2807]: I1216 12:54:28.514010 2807 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/942b94f4-174b-4d9a-b7f3-55c25ca8719b-kube-api-access-dfw6z" (OuterVolumeSpecName: "kube-api-access-dfw6z") pod "942b94f4-174b-4d9a-b7f3-55c25ca8719b" (UID: "942b94f4-174b-4d9a-b7f3-55c25ca8719b"). InnerVolumeSpecName "kube-api-access-dfw6z". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 12:54:28.600196 kubelet[2807]: I1216 12:54:28.599669 2807 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dfw6z\" (UniqueName: \"kubernetes.io/projected/942b94f4-174b-4d9a-b7f3-55c25ca8719b-kube-api-access-dfw6z\") on node \"ci-4515-1-0-8-2e3d7ab7bb\" DevicePath \"\"" Dec 16 12:54:28.600196 kubelet[2807]: I1216 12:54:28.599708 2807 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/942b94f4-174b-4d9a-b7f3-55c25ca8719b-whisker-backend-key-pair\") on node \"ci-4515-1-0-8-2e3d7ab7bb\" DevicePath \"\"" Dec 16 12:54:28.600196 kubelet[2807]: I1216 12:54:28.599721 2807 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/942b94f4-174b-4d9a-b7f3-55c25ca8719b-whisker-ca-bundle\") on node \"ci-4515-1-0-8-2e3d7ab7bb\" DevicePath \"\"" Dec 16 12:54:28.615491 systemd[1]: var-lib-kubelet-pods-942b94f4\x2d174b\x2d4d9a\x2db7f3\x2d55c25ca8719b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddfw6z.mount: Deactivated successfully. Dec 16 12:54:28.615565 systemd[1]: var-lib-kubelet-pods-942b94f4\x2d174b\x2d4d9a\x2db7f3\x2d55c25ca8719b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 16 12:54:28.643719 systemd[1]: Removed slice kubepods-besteffort-pod942b94f4_174b_4d9a_b7f3_55c25ca8719b.slice - libcontainer container kubepods-besteffort-pod942b94f4_174b_4d9a_b7f3_55c25ca8719b.slice. 
Dec 16 12:54:28.660035 kubelet[2807]: I1216 12:54:28.659963 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-ksf25" podStartSLOduration=1.5561990570000002 podStartE2EDuration="14.659947032s" podCreationTimestamp="2025-12-16 12:54:14 +0000 UTC" firstStartedPulling="2025-12-16 12:54:14.581085385 +0000 UTC m=+22.366739387" lastFinishedPulling="2025-12-16 12:54:27.684833359 +0000 UTC m=+35.470487362" observedRunningTime="2025-12-16 12:54:28.654699094 +0000 UTC m=+36.440353107" watchObservedRunningTime="2025-12-16 12:54:28.659947032 +0000 UTC m=+36.445601045" Dec 16 12:54:28.727843 systemd[1]: Created slice kubepods-besteffort-pod662f30c6_4ed6_44dc_96b4_74080eea2751.slice - libcontainer container kubepods-besteffort-pod662f30c6_4ed6_44dc_96b4_74080eea2751.slice. Dec 16 12:54:28.801855 kubelet[2807]: I1216 12:54:28.801401 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/662f30c6-4ed6-44dc-96b4-74080eea2751-whisker-ca-bundle\") pod \"whisker-6746c4dc5c-qzpt8\" (UID: \"662f30c6-4ed6-44dc-96b4-74080eea2751\") " pod="calico-system/whisker-6746c4dc5c-qzpt8" Dec 16 12:54:28.801855 kubelet[2807]: I1216 12:54:28.801469 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv486\" (UniqueName: \"kubernetes.io/projected/662f30c6-4ed6-44dc-96b4-74080eea2751-kube-api-access-nv486\") pod \"whisker-6746c4dc5c-qzpt8\" (UID: \"662f30c6-4ed6-44dc-96b4-74080eea2751\") " pod="calico-system/whisker-6746c4dc5c-qzpt8" Dec 16 12:54:28.801855 kubelet[2807]: I1216 12:54:28.801491 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/662f30c6-4ed6-44dc-96b4-74080eea2751-whisker-backend-key-pair\") pod \"whisker-6746c4dc5c-qzpt8\" (UID: 
\"662f30c6-4ed6-44dc-96b4-74080eea2751\") " pod="calico-system/whisker-6746c4dc5c-qzpt8" Dec 16 12:54:29.036312 containerd[1649]: time="2025-12-16T12:54:29.036256856Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6746c4dc5c-qzpt8,Uid:662f30c6-4ed6-44dc-96b4-74080eea2751,Namespace:calico-system,Attempt:0,}" Dec 16 12:54:29.294471 systemd-networkd[1544]: cali90dd5cadb59: Link UP Dec 16 12:54:29.295390 systemd-networkd[1544]: cali90dd5cadb59: Gained carrier Dec 16 12:54:29.307455 containerd[1649]: 2025-12-16 12:54:29.079 [INFO][3949] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:54:29.307455 containerd[1649]: 2025-12-16 12:54:29.105 [INFO][3949] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0 whisker-6746c4dc5c- calico-system 662f30c6-4ed6-44dc-96b4-74080eea2751 904 0 2025-12-16 12:54:28 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6746c4dc5c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4515-1-0-8-2e3d7ab7bb whisker-6746c4dc5c-qzpt8 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali90dd5cadb59 [] [] }} ContainerID="e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" Namespace="calico-system" Pod="whisker-6746c4dc5c-qzpt8" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-" Dec 16 12:54:29.307455 containerd[1649]: 2025-12-16 12:54:29.106 [INFO][3949] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" Namespace="calico-system" Pod="whisker-6746c4dc5c-qzpt8" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0" Dec 16 12:54:29.307455 containerd[1649]: 2025-12-16 12:54:29.239 [INFO][3958] 
ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" HandleID="k8s-pod-network.e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0" Dec 16 12:54:29.307931 containerd[1649]: 2025-12-16 12:54:29.243 [INFO][3958] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" HandleID="k8s-pod-network.e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000353870), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-8-2e3d7ab7bb", "pod":"whisker-6746c4dc5c-qzpt8", "timestamp":"2025-12-16 12:54:29.239373526 +0000 UTC"}, Hostname:"ci-4515-1-0-8-2e3d7ab7bb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:54:29.307931 containerd[1649]: 2025-12-16 12:54:29.243 [INFO][3958] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:54:29.307931 containerd[1649]: 2025-12-16 12:54:29.244 [INFO][3958] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:54:29.307931 containerd[1649]: 2025-12-16 12:54:29.244 [INFO][3958] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-8-2e3d7ab7bb' Dec 16 12:54:29.307931 containerd[1649]: 2025-12-16 12:54:29.255 [INFO][3958] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:29.307931 containerd[1649]: 2025-12-16 12:54:29.263 [INFO][3958] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:29.307931 containerd[1649]: 2025-12-16 12:54:29.267 [INFO][3958] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:29.307931 containerd[1649]: 2025-12-16 12:54:29.269 [INFO][3958] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:29.307931 containerd[1649]: 2025-12-16 12:54:29.270 [INFO][3958] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:29.308414 containerd[1649]: 2025-12-16 12:54:29.270 [INFO][3958] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:29.308414 containerd[1649]: 2025-12-16 12:54:29.271 [INFO][3958] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250 Dec 16 12:54:29.308414 containerd[1649]: 2025-12-16 12:54:29.276 [INFO][3958] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:29.308414 containerd[1649]: 2025-12-16 12:54:29.280 [INFO][3958] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.106.1/26] block=192.168.106.0/26 handle="k8s-pod-network.e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:29.308414 containerd[1649]: 2025-12-16 12:54:29.280 [INFO][3958] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.1/26] handle="k8s-pod-network.e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:29.308414 containerd[1649]: 2025-12-16 12:54:29.280 [INFO][3958] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:54:29.308414 containerd[1649]: 2025-12-16 12:54:29.280 [INFO][3958] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.1/26] IPv6=[] ContainerID="e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" HandleID="k8s-pod-network.e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0" Dec 16 12:54:29.308525 containerd[1649]: 2025-12-16 12:54:29.282 [INFO][3949] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" Namespace="calico-system" Pod="whisker-6746c4dc5c-qzpt8" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0", GenerateName:"whisker-6746c4dc5c-", Namespace:"calico-system", SelfLink:"", UID:"662f30c6-4ed6-44dc-96b4-74080eea2751", ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6746c4dc5c", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"", Pod:"whisker-6746c4dc5c-qzpt8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.106.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali90dd5cadb59", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:29.308525 containerd[1649]: 2025-12-16 12:54:29.282 [INFO][3949] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.1/32] ContainerID="e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" Namespace="calico-system" Pod="whisker-6746c4dc5c-qzpt8" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0" Dec 16 12:54:29.308594 containerd[1649]: 2025-12-16 12:54:29.282 [INFO][3949] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali90dd5cadb59 ContainerID="e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" Namespace="calico-system" Pod="whisker-6746c4dc5c-qzpt8" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0" Dec 16 12:54:29.308594 containerd[1649]: 2025-12-16 12:54:29.290 [INFO][3949] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" Namespace="calico-system" Pod="whisker-6746c4dc5c-qzpt8" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0" Dec 16 12:54:29.309205 containerd[1649]: 2025-12-16 12:54:29.290 [INFO][3949] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" Namespace="calico-system" Pod="whisker-6746c4dc5c-qzpt8" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0", GenerateName:"whisker-6746c4dc5c-", Namespace:"calico-system", SelfLink:"", UID:"662f30c6-4ed6-44dc-96b4-74080eea2751", ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6746c4dc5c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250", Pod:"whisker-6746c4dc5c-qzpt8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.106.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali90dd5cadb59", MAC:"16:fb:63:65:48:aa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:29.309303 containerd[1649]: 2025-12-16 12:54:29.301 [INFO][3949] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" Namespace="calico-system" Pod="whisker-6746c4dc5c-qzpt8" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-whisker--6746c4dc5c--qzpt8-eth0" Dec 16 12:54:29.487863 containerd[1649]: time="2025-12-16T12:54:29.487801860Z" level=info msg="connecting to shim e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250" address="unix:///run/containerd/s/47e41af938c4a1f41cb8fc0ac1e10dfbfab564910792ca39a1b542e4436a141d" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:54:29.512361 systemd[1]: Started cri-containerd-e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250.scope - libcontainer container e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250. Dec 16 12:54:29.529000 audit: BPF prog-id=180 op=LOAD Dec 16 12:54:29.530000 audit: BPF prog-id=181 op=LOAD Dec 16 12:54:29.530000 audit[3990]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=3979 pid=3990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538623631363231363537356539646233313134306135633732623661 Dec 16 12:54:29.530000 audit: BPF prog-id=181 op=UNLOAD Dec 16 12:54:29.530000 audit[3990]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3979 pid=3990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.530000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538623631363231363537356539646233313134306135633732623661 Dec 16 12:54:29.530000 audit: BPF prog-id=182 op=LOAD Dec 16 12:54:29.530000 audit[3990]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3979 pid=3990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538623631363231363537356539646233313134306135633732623661 Dec 16 12:54:29.530000 audit: BPF prog-id=183 op=LOAD Dec 16 12:54:29.530000 audit[3990]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=3979 pid=3990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538623631363231363537356539646233313134306135633732623661 Dec 16 12:54:29.530000 audit: BPF prog-id=183 op=UNLOAD Dec 16 12:54:29.530000 audit[3990]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3979 pid=3990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:54:29.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538623631363231363537356539646233313134306135633732623661 Dec 16 12:54:29.530000 audit: BPF prog-id=182 op=UNLOAD Dec 16 12:54:29.530000 audit[3990]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3979 pid=3990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538623631363231363537356539646233313134306135633732623661 Dec 16 12:54:29.530000 audit: BPF prog-id=184 op=LOAD Dec 16 12:54:29.530000 audit[3990]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=3979 pid=3990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.530000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6538623631363231363537356539646233313134306135633732623661 Dec 16 12:54:29.584776 containerd[1649]: time="2025-12-16T12:54:29.583285445Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6746c4dc5c-qzpt8,Uid:662f30c6-4ed6-44dc-96b4-74080eea2751,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"e8b616216575e9db31140a5c72b6ab3327904fc9ebc5ad2d0e346303059d0250\"" Dec 16 12:54:29.593014 containerd[1649]: time="2025-12-16T12:54:29.592969988Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:54:29.642816 kubelet[2807]: I1216 12:54:29.642796 2807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:54:29.903000 audit: BPF prog-id=185 op=LOAD Dec 16 12:54:29.903000 audit[4137]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcd154a580 a2=98 a3=1fffffffffffffff items=0 ppid=4033 pid=4137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.903000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:54:29.903000 audit: BPF prog-id=185 op=UNLOAD Dec 16 12:54:29.903000 audit[4137]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffcd154a550 a3=0 items=0 ppid=4033 pid=4137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.903000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:54:29.903000 audit: BPF prog-id=186 op=LOAD Dec 16 12:54:29.903000 audit[4137]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcd154a460 a2=94 a3=3 items=0 ppid=4033 pid=4137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.903000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:54:29.903000 audit: BPF prog-id=186 op=UNLOAD Dec 16 12:54:29.903000 audit[4137]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffcd154a460 a2=94 a3=3 items=0 ppid=4033 pid=4137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.903000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:54:29.903000 audit: BPF prog-id=187 op=LOAD Dec 16 12:54:29.903000 audit[4137]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcd154a4a0 a2=94 a3=7ffcd154a680 items=0 ppid=4033 pid=4137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.903000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:54:29.903000 audit: BPF prog-id=187 op=UNLOAD Dec 16 12:54:29.903000 audit[4137]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffcd154a4a0 a2=94 
a3=7ffcd154a680 items=0 ppid=4033 pid=4137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.903000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:54:29.905000 audit: BPF prog-id=188 op=LOAD Dec 16 12:54:29.905000 audit[4138]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe87d190c0 a2=98 a3=3 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.905000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:29.905000 audit: BPF prog-id=188 op=UNLOAD Dec 16 12:54:29.905000 audit[4138]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffe87d19090 a3=0 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.905000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:29.906000 audit: BPF prog-id=189 op=LOAD Dec 16 12:54:29.906000 audit[4138]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe87d18eb0 a2=94 a3=54428f items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.906000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:29.906000 
audit: BPF prog-id=189 op=UNLOAD Dec 16 12:54:29.906000 audit[4138]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe87d18eb0 a2=94 a3=54428f items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.906000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:29.906000 audit: BPF prog-id=190 op=LOAD Dec 16 12:54:29.906000 audit[4138]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe87d18ee0 a2=94 a3=2 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.906000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:29.906000 audit: BPF prog-id=190 op=UNLOAD Dec 16 12:54:29.906000 audit[4138]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe87d18ee0 a2=0 a3=2 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:29.906000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.034388 containerd[1649]: time="2025-12-16T12:54:30.034349461Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:30.035558 containerd[1649]: time="2025-12-16T12:54:30.035528088Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:54:30.035632 containerd[1649]: 
time="2025-12-16T12:54:30.035599212Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:30.035968 kubelet[2807]: E1216 12:54:30.035795 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:54:30.035968 kubelet[2807]: E1216 12:54:30.035834 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:54:30.037000 audit: BPF prog-id=191 op=LOAD Dec 16 12:54:30.037000 audit[4138]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe87d18da0 a2=94 a3=1 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.037000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.037000 audit: BPF prog-id=191 op=UNLOAD Dec 16 12:54:30.037000 audit[4138]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe87d18da0 a2=94 a3=1 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.037000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.046977 kubelet[2807]: E1216 12:54:30.046936 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:6195276ecb614f0fa525b9a7c33c407b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nv486,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6746c4dc5c-qzpt8_calico-system(662f30c6-4ed6-44dc-96b4-74080eea2751): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:30.046000 audit: BPF prog-id=192 op=LOAD Dec 16 12:54:30.046000 audit[4138]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe87d18d90 a2=94 a3=4 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.046000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.046000 audit: BPF prog-id=192 op=UNLOAD Dec 16 12:54:30.046000 audit[4138]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffe87d18d90 a2=0 a3=4 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.046000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.047000 audit: BPF prog-id=193 op=LOAD Dec 16 12:54:30.047000 audit[4138]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffe87d18bf0 a2=94 a3=5 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.047000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.047000 audit: BPF prog-id=193 op=UNLOAD Dec 16 12:54:30.047000 audit[4138]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffe87d18bf0 a2=0 a3=5 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.047000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.047000 audit: BPF prog-id=194 op=LOAD Dec 16 12:54:30.047000 audit[4138]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe87d18e10 a2=94 a3=6 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.047000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.047000 audit: BPF prog-id=194 op=UNLOAD Dec 16 12:54:30.047000 audit[4138]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffe87d18e10 a2=0 a3=6 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.047000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.047000 audit: BPF prog-id=195 op=LOAD Dec 16 12:54:30.047000 audit[4138]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe87d185c0 a2=94 a3=88 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.047000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.047000 audit: BPF prog-id=196 op=LOAD Dec 16 12:54:30.047000 audit[4138]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffe87d18440 a2=94 a3=2 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.047000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.047000 audit: BPF prog-id=196 op=UNLOAD Dec 16 12:54:30.047000 audit[4138]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffe87d18470 a2=0 a3=7ffe87d18570 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.047000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.048000 audit: BPF prog-id=195 op=UNLOAD Dec 16 12:54:30.048000 audit[4138]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7230d10 a2=0 a3=23bccc954c4570e5 items=0 ppid=4033 pid=4138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.048000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:54:30.051031 containerd[1649]: time="2025-12-16T12:54:30.051004305Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:54:30.058000 audit: BPF prog-id=197 op=LOAD Dec 16 12:54:30.058000 audit[4141]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc80498a40 a2=98 a3=1999999999999999 items=0 ppid=4033 pid=4141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.058000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:54:30.058000 audit: BPF prog-id=197 op=UNLOAD Dec 16 12:54:30.058000 audit[4141]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffc80498a10 a3=0 items=0 ppid=4033 pid=4141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.058000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:54:30.058000 audit: BPF prog-id=198 op=LOAD Dec 16 12:54:30.058000 audit[4141]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc80498920 a2=94 a3=ffff items=0 ppid=4033 pid=4141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.058000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:54:30.058000 audit: BPF prog-id=198 op=UNLOAD Dec 16 12:54:30.058000 audit[4141]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc80498920 a2=94 a3=ffff items=0 ppid=4033 pid=4141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.058000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:54:30.058000 audit: BPF prog-id=199 op=LOAD Dec 16 12:54:30.058000 audit[4141]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc80498960 a2=94 a3=7ffc80498b40 items=0 ppid=4033 pid=4141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.058000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:54:30.058000 audit: BPF prog-id=199 op=UNLOAD Dec 16 12:54:30.058000 audit[4141]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc80498960 a2=94 a3=7ffc80498b40 items=0 ppid=4033 pid=4141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.058000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:54:30.124298 systemd-networkd[1544]: vxlan.calico: Link UP Dec 16 12:54:30.124306 systemd-networkd[1544]: vxlan.calico: Gained carrier Dec 16 12:54:30.149000 audit: BPF prog-id=200 op=LOAD Dec 16 12:54:30.149000 audit[4165]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff110867e0 a2=98 a3=0 items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.149000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.149000 audit: BPF prog-id=200 op=UNLOAD Dec 16 12:54:30.149000 audit[4165]: SYSCALL arch=c000003e syscall=3 
success=yes exit=0 a0=3 a1=8 a2=7fff110867b0 a3=0 items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.149000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.150000 audit: BPF prog-id=201 op=LOAD Dec 16 12:54:30.150000 audit[4165]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff110865f0 a2=94 a3=54428f items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.150000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.150000 audit: BPF prog-id=201 op=UNLOAD Dec 16 12:54:30.150000 audit[4165]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff110865f0 a2=94 a3=54428f items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.150000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.150000 audit: BPF prog-id=202 op=LOAD Dec 16 12:54:30.150000 audit[4165]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff11086620 a2=94 a3=2 items=0 
ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.150000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.150000 audit: BPF prog-id=202 op=UNLOAD Dec 16 12:54:30.150000 audit[4165]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff11086620 a2=0 a3=2 items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.150000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.150000 audit: BPF prog-id=203 op=LOAD Dec 16 12:54:30.150000 audit[4165]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff110863d0 a2=94 a3=4 items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.150000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.151000 audit: BPF prog-id=203 op=UNLOAD Dec 16 12:54:30.151000 audit[4165]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff110863d0 a2=94 a3=4 items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.151000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.151000 audit: BPF prog-id=204 op=LOAD Dec 16 12:54:30.151000 audit[4165]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff110864d0 a2=94 a3=7fff11086650 items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.151000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.151000 audit: BPF prog-id=204 op=UNLOAD Dec 16 12:54:30.151000 audit[4165]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff110864d0 a2=0 a3=7fff11086650 items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.151000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.151000 audit: BPF prog-id=205 op=LOAD Dec 16 12:54:30.151000 audit[4165]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff11085c00 a2=94 a3=2 items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.151000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.151000 audit: BPF prog-id=205 op=UNLOAD Dec 16 12:54:30.151000 audit[4165]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff11085c00 a2=0 a3=2 items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.151000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.151000 audit: BPF prog-id=206 op=LOAD Dec 16 12:54:30.151000 audit[4165]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff11085d00 a2=94 a3=30 items=0 ppid=4033 pid=4165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.151000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:54:30.163000 audit: BPF prog-id=207 op=LOAD Dec 16 12:54:30.163000 audit[4169]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff467a4cf0 a2=98 a3=0 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.163000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.163000 audit: BPF prog-id=207 op=UNLOAD Dec 16 12:54:30.163000 audit[4169]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff467a4cc0 a3=0 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.163000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.163000 audit: BPF prog-id=208 op=LOAD Dec 16 12:54:30.163000 audit[4169]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff467a4ae0 a2=94 a3=54428f items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.163000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.164000 audit: BPF prog-id=208 op=UNLOAD Dec 16 12:54:30.164000 audit[4169]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff467a4ae0 a2=94 a3=54428f items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.164000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.164000 audit: BPF prog-id=209 op=LOAD Dec 16 12:54:30.164000 audit[4169]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff467a4b10 a2=94 a3=2 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.164000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.164000 audit: BPF prog-id=209 op=UNLOAD Dec 16 12:54:30.164000 audit[4169]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff467a4b10 a2=0 a3=2 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.164000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.293000 audit: BPF prog-id=210 op=LOAD Dec 16 12:54:30.293000 audit[4169]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff467a49d0 a2=94 a3=1 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.293000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.293000 audit: BPF prog-id=210 op=UNLOAD Dec 16 12:54:30.293000 audit[4169]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff467a49d0 a2=94 a3=1 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.293000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.302000 audit: BPF prog-id=211 op=LOAD Dec 16 12:54:30.302000 audit[4169]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff467a49c0 a2=94 a3=4 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.302000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.302000 audit: BPF prog-id=211 op=UNLOAD Dec 16 12:54:30.302000 audit[4169]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff467a49c0 a2=0 a3=4 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.302000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.302000 audit: BPF prog-id=212 op=LOAD Dec 16 12:54:30.302000 audit[4169]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff467a4820 a2=94 a3=5 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.302000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.302000 audit: BPF prog-id=212 op=UNLOAD Dec 16 12:54:30.302000 audit[4169]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff467a4820 a2=0 a3=5 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.302000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.302000 audit: BPF prog-id=213 op=LOAD Dec 16 12:54:30.302000 audit[4169]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff467a4a40 a2=94 a3=6 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.302000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.302000 audit: BPF prog-id=213 op=UNLOAD Dec 16 12:54:30.302000 audit[4169]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff467a4a40 a2=0 a3=6 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.302000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.302000 audit: BPF prog-id=214 op=LOAD Dec 16 12:54:30.302000 audit[4169]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff467a41f0 a2=94 a3=88 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.302000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.302000 audit: BPF prog-id=215 op=LOAD Dec 16 12:54:30.302000 audit[4169]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7fff467a4070 a2=94 a3=2 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.302000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.302000 audit: BPF prog-id=215 op=UNLOAD Dec 16 12:54:30.302000 audit[4169]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7fff467a40a0 a2=0 a3=7fff467a41a0 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.302000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.303000 audit: BPF prog-id=214 op=UNLOAD Dec 16 12:54:30.303000 audit[4169]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=148dd10 a2=0 a3=8d8765b291b2c4b0 items=0 ppid=4033 pid=4169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.303000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:54:30.307000 audit: BPF prog-id=206 op=UNLOAD Dec 16 12:54:30.307000 audit[4033]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c0006d1240 a2=0 a3=0 items=0 ppid=4022 pid=4033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.307000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Dec 16 12:54:30.316111 kubelet[2807]: I1216 12:54:30.316076 2807 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="942b94f4-174b-4d9a-b7f3-55c25ca8719b" path="/var/lib/kubelet/pods/942b94f4-174b-4d9a-b7f3-55c25ca8719b/volumes" Dec 16 12:54:30.349000 audit[4191]: NETFILTER_CFG table=raw:119 family=2 entries=21 op=nft_register_chain pid=4191 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:30.349000 audit[4191]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffd116bd800 a2=0 a3=7ffd116bd7ec items=0 ppid=4033 pid=4191 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.349000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:30.364000 audit[4192]: NETFILTER_CFG table=mangle:120 family=2 entries=16 op=nft_register_chain pid=4192 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:30.364000 audit[4192]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7fff7eda8310 a2=0 a3=7fff7eda82fc items=0 ppid=4033 pid=4192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.364000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:30.365000 audit[4196]: NETFILTER_CFG table=nat:121 family=2 entries=15 op=nft_register_chain pid=4196 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:30.365000 audit[4196]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffcc59a9b90 a2=0 a3=7ffcc59a9b7c items=0 ppid=4033 pid=4196 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.365000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:30.370000 audit[4195]: NETFILTER_CFG table=filter:122 family=2 entries=94 op=nft_register_chain pid=4195 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:30.370000 audit[4195]: SYSCALL arch=c000003e syscall=46 success=yes exit=53116 a0=3 a1=7fffa2c29180 a2=0 a3=7fffa2c2916c items=0 ppid=4033 pid=4195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.370000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:30.479518 containerd[1649]: time="2025-12-16T12:54:30.479396885Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:30.480614 containerd[1649]: time="2025-12-16T12:54:30.480406926Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:54:30.480727 containerd[1649]: time="2025-12-16T12:54:30.480549805Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:30.481024 kubelet[2807]: E1216 12:54:30.480956 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:54:30.481074 kubelet[2807]: E1216 12:54:30.481029 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:54:30.481450 kubelet[2807]: E1216 12:54:30.481185 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nv486,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capab
ilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6746c4dc5c-qzpt8_calico-system(662f30c6-4ed6-44dc-96b4-74080eea2751): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:30.482825 kubelet[2807]: E1216 12:54:30.482788 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:54:30.645620 kubelet[2807]: E1216 12:54:30.645578 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed 
to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:54:30.667000 audit[4205]: NETFILTER_CFG table=filter:123 family=2 entries=20 op=nft_register_rule pid=4205 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:30.667000 audit[4205]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffd79afcd60 a2=0 a3=7ffd79afcd4c items=0 ppid=2949 pid=4205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.667000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:30.671000 audit[4205]: NETFILTER_CFG table=nat:124 family=2 entries=14 op=nft_register_rule pid=4205 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:30.671000 audit[4205]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffd79afcd60 a2=0 a3=0 items=0 ppid=2949 pid=4205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:30.671000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:31.212493 systemd-networkd[1544]: cali90dd5cadb59: Gained IPv6LL Dec 16 
12:54:31.788429 systemd-networkd[1544]: vxlan.calico: Gained IPv6LL Dec 16 12:54:32.409749 kubelet[2807]: I1216 12:54:32.409415 2807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:54:34.315676 containerd[1649]: time="2025-12-16T12:54:34.314810525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cn7m7,Uid:f5b0dddf-d30f-46e4-b43e-4240324fba15,Namespace:kube-system,Attempt:0,}" Dec 16 12:54:34.315676 containerd[1649]: time="2025-12-16T12:54:34.315562820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f5b8bcc75-trldf,Uid:4afbd6d6-aac3-4d68-be88-76917639058c,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:54:34.316144 containerd[1649]: time="2025-12-16T12:54:34.315767305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f5b8bcc75-wg5dd,Uid:9345e167-5638-4038-a959-3d55222d2d5c,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:54:34.316144 containerd[1649]: time="2025-12-16T12:54:34.315824071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5fjr9,Uid:7255bf38-4b45-44eb-b0b0-8e800109b0ec,Namespace:kube-system,Attempt:0,}" Dec 16 12:54:34.490690 systemd-networkd[1544]: cali2cfc4da11a8: Link UP Dec 16 12:54:34.493074 systemd-networkd[1544]: cali2cfc4da11a8: Gained carrier Dec 16 12:54:34.525325 containerd[1649]: 2025-12-16 12:54:34.387 [INFO][4264] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0 calico-apiserver-6f5b8bcc75- calico-apiserver 4afbd6d6-aac3-4d68-be88-76917639058c 837 0 2025-12-16 12:54:08 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f5b8bcc75 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s 
ci-4515-1-0-8-2e3d7ab7bb calico-apiserver-6f5b8bcc75-trldf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2cfc4da11a8 [] [] }} ContainerID="32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-trldf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-" Dec 16 12:54:34.525325 containerd[1649]: 2025-12-16 12:54:34.387 [INFO][4264] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-trldf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0" Dec 16 12:54:34.525325 containerd[1649]: 2025-12-16 12:54:34.412 [INFO][4304] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" HandleID="k8s-pod-network.32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0" Dec 16 12:54:34.525596 containerd[1649]: 2025-12-16 12:54:34.412 [INFO][4304] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" HandleID="k8s-pod-network.32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5180), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4515-1-0-8-2e3d7ab7bb", "pod":"calico-apiserver-6f5b8bcc75-trldf", "timestamp":"2025-12-16 12:54:34.412500394 +0000 UTC"}, Hostname:"ci-4515-1-0-8-2e3d7ab7bb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:54:34.525596 containerd[1649]: 2025-12-16 12:54:34.412 [INFO][4304] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:54:34.525596 containerd[1649]: 2025-12-16 12:54:34.412 [INFO][4304] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:54:34.525596 containerd[1649]: 2025-12-16 12:54:34.412 [INFO][4304] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-8-2e3d7ab7bb' Dec 16 12:54:34.525596 containerd[1649]: 2025-12-16 12:54:34.422 [INFO][4304] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.525596 containerd[1649]: 2025-12-16 12:54:34.428 [INFO][4304] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.525596 containerd[1649]: 2025-12-16 12:54:34.437 [INFO][4304] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.525596 containerd[1649]: 2025-12-16 12:54:34.441 [INFO][4304] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.525596 containerd[1649]: 2025-12-16 12:54:34.446 [INFO][4304] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.525795 containerd[1649]: 2025-12-16 12:54:34.446 [INFO][4304] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.525795 containerd[1649]: 2025-12-16 12:54:34.448 [INFO][4304] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08 Dec 16 12:54:34.525795 containerd[1649]: 2025-12-16 12:54:34.454 [INFO][4304] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.525795 containerd[1649]: 2025-12-16 12:54:34.463 [INFO][4304] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.106.2/26] block=192.168.106.0/26 handle="k8s-pod-network.32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.525795 containerd[1649]: 2025-12-16 12:54:34.463 [INFO][4304] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.2/26] handle="k8s-pod-network.32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.525795 containerd[1649]: 2025-12-16 12:54:34.463 [INFO][4304] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:54:34.525795 containerd[1649]: 2025-12-16 12:54:34.464 [INFO][4304] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.2/26] IPv6=[] ContainerID="32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" HandleID="k8s-pod-network.32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0" Dec 16 12:54:34.526822 containerd[1649]: 2025-12-16 12:54:34.474 [INFO][4264] cni-plugin/k8s.go 418: Populated endpoint ContainerID="32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-trldf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0", GenerateName:"calico-apiserver-6f5b8bcc75-", Namespace:"calico-apiserver", SelfLink:"", UID:"4afbd6d6-aac3-4d68-be88-76917639058c", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f5b8bcc75", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"", Pod:"calico-apiserver-6f5b8bcc75-trldf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.106.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2cfc4da11a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:34.526880 containerd[1649]: 2025-12-16 12:54:34.474 [INFO][4264] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.2/32] ContainerID="32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-trldf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0" Dec 16 12:54:34.526880 containerd[1649]: 2025-12-16 12:54:34.477 [INFO][4264] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2cfc4da11a8 ContainerID="32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-trldf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0" Dec 16 12:54:34.526880 containerd[1649]: 2025-12-16 12:54:34.498 [INFO][4264] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-trldf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0" Dec 16 12:54:34.526931 containerd[1649]: 2025-12-16 12:54:34.500 [INFO][4264] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-trldf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0", GenerateName:"calico-apiserver-6f5b8bcc75-", Namespace:"calico-apiserver", SelfLink:"", UID:"4afbd6d6-aac3-4d68-be88-76917639058c", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f5b8bcc75", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08", Pod:"calico-apiserver-6f5b8bcc75-trldf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2cfc4da11a8", MAC:"2a:e5:8a:11:7f:af", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:34.526978 containerd[1649]: 2025-12-16 12:54:34.523 [INFO][4264] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-trldf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--trldf-eth0" Dec 16 12:54:34.560336 containerd[1649]: time="2025-12-16T12:54:34.560288458Z" 
level=info msg="connecting to shim 32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08" address="unix:///run/containerd/s/c5a5f621f2069062a95ed5ca13a1f9a106a058570bddf46fce69942ebf5d4d87" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:54:34.585702 systemd-networkd[1544]: cali61dc65d5646: Link UP Dec 16 12:54:34.586139 systemd-networkd[1544]: cali61dc65d5646: Gained carrier Dec 16 12:54:34.610044 containerd[1649]: 2025-12-16 12:54:34.436 [INFO][4270] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0 calico-apiserver-6f5b8bcc75- calico-apiserver 9345e167-5638-4038-a959-3d55222d2d5c 836 0 2025-12-16 12:54:08 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f5b8bcc75 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4515-1-0-8-2e3d7ab7bb calico-apiserver-6f5b8bcc75-wg5dd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali61dc65d5646 [] [] }} ContainerID="5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-wg5dd" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-" Dec 16 12:54:34.610044 containerd[1649]: 2025-12-16 12:54:34.436 [INFO][4270] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-wg5dd" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0" Dec 16 12:54:34.610044 containerd[1649]: 2025-12-16 12:54:34.508 [INFO][4318] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" HandleID="k8s-pod-network.5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0" Dec 16 12:54:34.610275 containerd[1649]: 2025-12-16 12:54:34.510 [INFO][4318] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" HandleID="k8s-pod-network.5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f6e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4515-1-0-8-2e3d7ab7bb", "pod":"calico-apiserver-6f5b8bcc75-wg5dd", "timestamp":"2025-12-16 12:54:34.508794898 +0000 UTC"}, Hostname:"ci-4515-1-0-8-2e3d7ab7bb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:54:34.610275 containerd[1649]: 2025-12-16 12:54:34.510 [INFO][4318] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:54:34.610275 containerd[1649]: 2025-12-16 12:54:34.510 [INFO][4318] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:54:34.610275 containerd[1649]: 2025-12-16 12:54:34.510 [INFO][4318] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-8-2e3d7ab7bb' Dec 16 12:54:34.610275 containerd[1649]: 2025-12-16 12:54:34.525 [INFO][4318] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.610275 containerd[1649]: 2025-12-16 12:54:34.535 [INFO][4318] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.610275 containerd[1649]: 2025-12-16 12:54:34.540 [INFO][4318] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.610275 containerd[1649]: 2025-12-16 12:54:34.544 [INFO][4318] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.610275 containerd[1649]: 2025-12-16 12:54:34.549 [INFO][4318] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.610463 containerd[1649]: 2025-12-16 12:54:34.549 [INFO][4318] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.610463 containerd[1649]: 2025-12-16 12:54:34.552 [INFO][4318] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16 Dec 16 12:54:34.610463 containerd[1649]: 2025-12-16 12:54:34.558 [INFO][4318] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.610463 containerd[1649]: 2025-12-16 12:54:34.568 [INFO][4318] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.106.3/26] block=192.168.106.0/26 handle="k8s-pod-network.5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.610463 containerd[1649]: 2025-12-16 12:54:34.568 [INFO][4318] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.3/26] handle="k8s-pod-network.5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.610463 containerd[1649]: 2025-12-16 12:54:34.568 [INFO][4318] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:54:34.610463 containerd[1649]: 2025-12-16 12:54:34.568 [INFO][4318] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.3/26] IPv6=[] ContainerID="5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" HandleID="k8s-pod-network.5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0" Dec 16 12:54:34.612258 containerd[1649]: 2025-12-16 12:54:34.577 [INFO][4270] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-wg5dd" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0", GenerateName:"calico-apiserver-6f5b8bcc75-", Namespace:"calico-apiserver", SelfLink:"", UID:"9345e167-5638-4038-a959-3d55222d2d5c", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f5b8bcc75", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"", Pod:"calico-apiserver-6f5b8bcc75-wg5dd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali61dc65d5646", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:34.612320 containerd[1649]: 2025-12-16 12:54:34.579 [INFO][4270] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.3/32] ContainerID="5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-wg5dd" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0" Dec 16 12:54:34.612320 containerd[1649]: 2025-12-16 12:54:34.581 [INFO][4270] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali61dc65d5646 ContainerID="5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-wg5dd" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0" Dec 16 12:54:34.612320 containerd[1649]: 2025-12-16 12:54:34.587 [INFO][4270] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" Namespace="calico-apiserver" 
Pod="calico-apiserver-6f5b8bcc75-wg5dd" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0" Dec 16 12:54:34.612374 containerd[1649]: 2025-12-16 12:54:34.588 [INFO][4270] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-wg5dd" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0", GenerateName:"calico-apiserver-6f5b8bcc75-", Namespace:"calico-apiserver", SelfLink:"", UID:"9345e167-5638-4038-a959-3d55222d2d5c", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f5b8bcc75", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16", Pod:"calico-apiserver-6f5b8bcc75-wg5dd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali61dc65d5646", MAC:"6a:d9:76:94:d6:9c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:34.612434 containerd[1649]: 2025-12-16 12:54:34.603 [INFO][4270] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" Namespace="calico-apiserver" Pod="calico-apiserver-6f5b8bcc75-wg5dd" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--apiserver--6f5b8bcc75--wg5dd-eth0" Dec 16 12:54:34.616557 systemd[1]: Started cri-containerd-32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08.scope - libcontainer container 32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08. Dec 16 12:54:34.632000 audit[4380]: NETFILTER_CFG table=filter:125 family=2 entries=50 op=nft_register_chain pid=4380 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:34.635486 kernel: kauditd_printk_skb: 237 callbacks suppressed Dec 16 12:54:34.635548 kernel: audit: type=1325 audit(1765889674.632:663): table=filter:125 family=2 entries=50 op=nft_register_chain pid=4380 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:34.632000 audit[4380]: SYSCALL arch=c000003e syscall=46 success=yes exit=28208 a0=3 a1=7ffe58f85f20 a2=0 a3=7ffe58f85f0c items=0 ppid=4033 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.649306 kernel: audit: type=1300 audit(1765889674.632:663): arch=c000003e syscall=46 success=yes exit=28208 a0=3 a1=7ffe58f85f20 a2=0 a3=7ffe58f85f0c items=0 ppid=4033 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:54:34.632000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:34.656264 kernel: audit: type=1327 audit(1765889674.632:663): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:34.640000 audit: BPF prog-id=216 op=LOAD Dec 16 12:54:34.662511 kernel: audit: type=1334 audit(1765889674.640:664): prog-id=216 op=LOAD Dec 16 12:54:34.641000 audit: BPF prog-id=217 op=LOAD Dec 16 12:54:34.665567 kernel: audit: type=1334 audit(1765889674.641:665): prog-id=217 op=LOAD Dec 16 12:54:34.672609 kernel: audit: type=1300 audit(1765889674.641:665): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=4355 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.641000 audit[4366]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=4355 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653733633162646133366538353039333963333030633664313936 Dec 16 12:54:34.680508 kernel: audit: type=1327 audit(1765889674.641:665): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653733633162646133366538353039333963333030633664313936 Dec 16 12:54:34.681238 kernel: audit: type=1334 audit(1765889674.641:666): prog-id=217 op=UNLOAD Dec 16 12:54:34.641000 audit: BPF prog-id=217 op=UNLOAD Dec 16 12:54:34.641000 audit[4366]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4355 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653733633162646133366538353039333963333030633664313936 Dec 16 12:54:34.692224 kernel: audit: type=1300 audit(1765889674.641:666): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4355 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.692712 kernel: audit: type=1327 audit(1765889674.641:666): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653733633162646133366538353039333963333030633664313936 Dec 16 12:54:34.641000 audit: BPF prog-id=218 op=LOAD Dec 16 12:54:34.641000 audit[4366]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=4355 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653733633162646133366538353039333963333030633664313936 Dec 16 12:54:34.641000 audit: BPF prog-id=219 op=LOAD Dec 16 12:54:34.641000 audit[4366]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=4355 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653733633162646133366538353039333963333030633664313936 Dec 16 12:54:34.641000 audit: BPF prog-id=219 op=UNLOAD Dec 16 12:54:34.641000 audit[4366]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4355 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653733633162646133366538353039333963333030633664313936 Dec 16 12:54:34.641000 audit: BPF prog-id=218 op=UNLOAD Dec 16 12:54:34.641000 audit[4366]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4355 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653733633162646133366538353039333963333030633664313936 Dec 16 12:54:34.641000 audit: BPF prog-id=220 op=LOAD Dec 16 12:54:34.641000 audit[4366]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=4355 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653733633162646133366538353039333963333030633664313936 Dec 16 12:54:34.709245 systemd-networkd[1544]: calicf56c34825f: Link UP Dec 16 12:54:34.713142 systemd-networkd[1544]: calicf56c34825f: Gained carrier Dec 16 12:54:34.744181 containerd[1649]: time="2025-12-16T12:54:34.743982625Z" level=info msg="connecting to shim 5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16" address="unix:///run/containerd/s/80bfb1d4fa32f0abe6bcaa35b72ff0f5b35e47b26de32dbf2d9a3863d948cf24" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:54:34.784464 containerd[1649]: 2025-12-16 12:54:34.439 [INFO][4279] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0 coredns-668d6bf9bc- kube-system 7255bf38-4b45-44eb-b0b0-8e800109b0ec 839 0 2025-12-16 12:53:57 +0000 UTC map[k8s-app:kube-dns 
pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4515-1-0-8-2e3d7ab7bb coredns-668d6bf9bc-5fjr9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calicf56c34825f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" Namespace="kube-system" Pod="coredns-668d6bf9bc-5fjr9" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-" Dec 16 12:54:34.784464 containerd[1649]: 2025-12-16 12:54:34.439 [INFO][4279] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" Namespace="kube-system" Pod="coredns-668d6bf9bc-5fjr9" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0" Dec 16 12:54:34.784464 containerd[1649]: 2025-12-16 12:54:34.513 [INFO][4323] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" HandleID="k8s-pod-network.e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0" Dec 16 12:54:34.784777 containerd[1649]: 2025-12-16 12:54:34.518 [INFO][4323] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" HandleID="k8s-pod-network.e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf8b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4515-1-0-8-2e3d7ab7bb", "pod":"coredns-668d6bf9bc-5fjr9", "timestamp":"2025-12-16 12:54:34.513042585 +0000 UTC"}, Hostname:"ci-4515-1-0-8-2e3d7ab7bb", 
IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:54:34.784777 containerd[1649]: 2025-12-16 12:54:34.519 [INFO][4323] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:54:34.784777 containerd[1649]: 2025-12-16 12:54:34.568 [INFO][4323] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:54:34.784777 containerd[1649]: 2025-12-16 12:54:34.568 [INFO][4323] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-8-2e3d7ab7bb' Dec 16 12:54:34.784777 containerd[1649]: 2025-12-16 12:54:34.625 [INFO][4323] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.784777 containerd[1649]: 2025-12-16 12:54:34.636 [INFO][4323] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.784777 containerd[1649]: 2025-12-16 12:54:34.650 [INFO][4323] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.784777 containerd[1649]: 2025-12-16 12:54:34.657 [INFO][4323] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.784777 containerd[1649]: 2025-12-16 12:54:34.660 [INFO][4323] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.784964 containerd[1649]: 2025-12-16 12:54:34.660 [INFO][4323] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.784964 containerd[1649]: 2025-12-16 12:54:34.663 [INFO][4323] ipam/ipam.go 1780: Creating new 
handle: k8s-pod-network.e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5 Dec 16 12:54:34.784964 containerd[1649]: 2025-12-16 12:54:34.675 [INFO][4323] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.784964 containerd[1649]: 2025-12-16 12:54:34.685 [INFO][4323] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.106.4/26] block=192.168.106.0/26 handle="k8s-pod-network.e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.784964 containerd[1649]: 2025-12-16 12:54:34.685 [INFO][4323] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.4/26] handle="k8s-pod-network.e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.784964 containerd[1649]: 2025-12-16 12:54:34.685 [INFO][4323] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:54:34.784964 containerd[1649]: 2025-12-16 12:54:34.685 [INFO][4323] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.4/26] IPv6=[] ContainerID="e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" HandleID="k8s-pod-network.e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0" Dec 16 12:54:34.785595 containerd[1649]: 2025-12-16 12:54:34.699 [INFO][4279] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" Namespace="kube-system" Pod="coredns-668d6bf9bc-5fjr9" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7255bf38-4b45-44eb-b0b0-8e800109b0ec", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 53, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"", Pod:"coredns-668d6bf9bc-5fjr9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calicf56c34825f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:34.785595 containerd[1649]: 2025-12-16 12:54:34.699 [INFO][4279] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.4/32] ContainerID="e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" Namespace="kube-system" Pod="coredns-668d6bf9bc-5fjr9" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0" Dec 16 12:54:34.785595 containerd[1649]: 2025-12-16 12:54:34.699 [INFO][4279] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicf56c34825f ContainerID="e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" Namespace="kube-system" Pod="coredns-668d6bf9bc-5fjr9" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0" Dec 16 12:54:34.785595 containerd[1649]: 2025-12-16 12:54:34.716 [INFO][4279] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" Namespace="kube-system" Pod="coredns-668d6bf9bc-5fjr9" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0" Dec 16 12:54:34.785595 containerd[1649]: 2025-12-16 12:54:34.719 [INFO][4279] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" Namespace="kube-system" Pod="coredns-668d6bf9bc-5fjr9" 
WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7255bf38-4b45-44eb-b0b0-8e800109b0ec", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 53, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5", Pod:"coredns-668d6bf9bc-5fjr9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicf56c34825f", MAC:"86:2e:17:96:19:a2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:34.785595 
containerd[1649]: 2025-12-16 12:54:34.767 [INFO][4279] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" Namespace="kube-system" Pod="coredns-668d6bf9bc-5fjr9" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--5fjr9-eth0" Dec 16 12:54:34.802668 containerd[1649]: time="2025-12-16T12:54:34.802637707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f5b8bcc75-trldf,Uid:4afbd6d6-aac3-4d68-be88-76917639058c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"32e73c1bda36e850939c300c6d196051bee9df11b1d4b4dd9336177ede944c08\"" Dec 16 12:54:34.807331 containerd[1649]: time="2025-12-16T12:54:34.807213991Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:54:34.812000 audit[4410]: NETFILTER_CFG table=filter:126 family=2 entries=41 op=nft_register_chain pid=4410 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:34.812000 audit[4410]: SYSCALL arch=c000003e syscall=46 success=yes exit=23076 a0=3 a1=7ffcbd2837b0 a2=0 a3=7ffcbd28379c items=0 ppid=4033 pid=4410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.812000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:34.825343 containerd[1649]: time="2025-12-16T12:54:34.825083195Z" level=info msg="connecting to shim e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5" address="unix:///run/containerd/s/1ecd8568487dcefc9c37d5e7e798a396cbeb92fdf5aa32d3868948f78062ec5a" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:54:34.845410 systemd[1]: Started 
cri-containerd-5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16.scope - libcontainer container 5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16. Dec 16 12:54:34.846924 systemd-networkd[1544]: cali793bdaf5102: Link UP Dec 16 12:54:34.850603 systemd-networkd[1544]: cali793bdaf5102: Gained carrier Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.431 [INFO][4258] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0 coredns-668d6bf9bc- kube-system f5b0dddf-d30f-46e4-b43e-4240324fba15 834 0 2025-12-16 12:53:57 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4515-1-0-8-2e3d7ab7bb coredns-668d6bf9bc-cn7m7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali793bdaf5102 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-cn7m7" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.431 [INFO][4258] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-cn7m7" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.519 [INFO][4316] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" HandleID="k8s-pod-network.cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" 
Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.519 [INFO][4316] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" HandleID="k8s-pod-network.cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e960), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4515-1-0-8-2e3d7ab7bb", "pod":"coredns-668d6bf9bc-cn7m7", "timestamp":"2025-12-16 12:54:34.517744114 +0000 UTC"}, Hostname:"ci-4515-1-0-8-2e3d7ab7bb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.519 [INFO][4316] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.687 [INFO][4316] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.687 [INFO][4316] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-8-2e3d7ab7bb' Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.728 [INFO][4316] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.759 [INFO][4316] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.775 [INFO][4316] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.777 [INFO][4316] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.779 [INFO][4316] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.779 [INFO][4316] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.783 [INFO][4316] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.799 [INFO][4316] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.820 [INFO][4316] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.106.5/26] block=192.168.106.0/26 handle="k8s-pod-network.cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.821 [INFO][4316] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.5/26] handle="k8s-pod-network.cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.821 [INFO][4316] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:54:34.875798 containerd[1649]: 2025-12-16 12:54:34.821 [INFO][4316] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.5/26] IPv6=[] ContainerID="cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" HandleID="k8s-pod-network.cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0" Dec 16 12:54:34.877067 containerd[1649]: 2025-12-16 12:54:34.835 [INFO][4258] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-cn7m7" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f5b0dddf-d30f-46e4-b43e-4240324fba15", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 53, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"", Pod:"coredns-668d6bf9bc-cn7m7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali793bdaf5102", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:34.877067 containerd[1649]: 2025-12-16 12:54:34.835 [INFO][4258] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.5/32] ContainerID="cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-cn7m7" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0" Dec 16 12:54:34.877067 containerd[1649]: 2025-12-16 12:54:34.836 [INFO][4258] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali793bdaf5102 ContainerID="cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-cn7m7" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0" Dec 16 12:54:34.877067 containerd[1649]: 2025-12-16 12:54:34.852 [INFO][4258] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-cn7m7" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0" Dec 16 12:54:34.877067 containerd[1649]: 2025-12-16 12:54:34.854 [INFO][4258] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-cn7m7" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f5b0dddf-d30f-46e4-b43e-4240324fba15", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 53, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c", Pod:"coredns-668d6bf9bc-cn7m7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali793bdaf5102", 
MAC:"3e:5f:c7:ab:11:e8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:34.877067 containerd[1649]: 2025-12-16 12:54:34.870 [INFO][4258] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-cn7m7" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-coredns--668d6bf9bc--cn7m7-eth0" Dec 16 12:54:34.891306 systemd[1]: Started cri-containerd-e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5.scope - libcontainer container e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5. 
Dec 16 12:54:34.892000 audit: BPF prog-id=221 op=LOAD Dec 16 12:54:34.893000 audit: BPF prog-id=222 op=LOAD Dec 16 12:54:34.893000 audit[4427]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=4405 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.893000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566383237366464613665636461333264333463653332323861346365 Dec 16 12:54:34.893000 audit: BPF prog-id=222 op=UNLOAD Dec 16 12:54:34.893000 audit[4427]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4405 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.893000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566383237366464613665636461333264333463653332323861346365 Dec 16 12:54:34.893000 audit: BPF prog-id=223 op=LOAD Dec 16 12:54:34.893000 audit[4427]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=4405 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.893000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566383237366464613665636461333264333463653332323861346365 Dec 16 12:54:34.894000 audit: BPF prog-id=224 op=LOAD Dec 16 12:54:34.894000 audit[4427]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=4405 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.894000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566383237366464613665636461333264333463653332323861346365 Dec 16 12:54:34.894000 audit: BPF prog-id=224 op=UNLOAD Dec 16 12:54:34.894000 audit[4427]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4405 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.894000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566383237366464613665636461333264333463653332323861346365 Dec 16 12:54:34.894000 audit: BPF prog-id=223 op=UNLOAD Dec 16 12:54:34.894000 audit[4427]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4405 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:54:34.894000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566383237366464613665636461333264333463653332323861346365 Dec 16 12:54:34.894000 audit: BPF prog-id=225 op=LOAD Dec 16 12:54:34.894000 audit[4427]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=4405 pid=4427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.894000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566383237366464613665636461333264333463653332323861346365 Dec 16 12:54:34.904000 audit: BPF prog-id=226 op=LOAD Dec 16 12:54:34.904000 audit: BPF prog-id=227 op=LOAD Dec 16 12:54:34.904000 audit[4460]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000174238 a2=98 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.904000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530656463316231316136646232306235346561376238356463393830 Dec 16 12:54:34.905000 audit: BPF prog-id=227 op=UNLOAD Dec 16 12:54:34.905000 audit[4460]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530656463316231316136646232306235346561376238356463393830 Dec 16 12:54:34.905000 audit: BPF prog-id=228 op=LOAD Dec 16 12:54:34.905000 audit[4460]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000174488 a2=98 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530656463316231316136646232306235346561376238356463393830 Dec 16 12:54:34.905000 audit: BPF prog-id=229 op=LOAD Dec 16 12:54:34.905000 audit[4460]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000174218 a2=98 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530656463316231316136646232306235346561376238356463393830 Dec 16 12:54:34.906000 audit: BPF prog-id=229 op=UNLOAD Dec 16 12:54:34.906000 audit[4460]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.906000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530656463316231316136646232306235346561376238356463393830 Dec 16 12:54:34.906000 audit: BPF prog-id=228 op=UNLOAD Dec 16 12:54:34.906000 audit[4460]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.906000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530656463316231316136646232306235346561376238356463393830 Dec 16 12:54:34.906000 audit: BPF prog-id=230 op=LOAD Dec 16 12:54:34.906000 audit[4460]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001746e8 a2=98 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.906000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530656463316231316136646232306235346561376238356463393830 Dec 16 12:54:34.915099 containerd[1649]: time="2025-12-16T12:54:34.914899320Z" level=info msg="connecting to shim cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c" 
address="unix:///run/containerd/s/7aae8768edac350f1d17069e41739452097fa1aa523310de5423b9148896c5fe" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:54:34.950436 systemd[1]: Started cri-containerd-cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c.scope - libcontainer container cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c. Dec 16 12:54:34.964000 audit: BPF prog-id=231 op=LOAD Dec 16 12:54:34.965000 audit: BPF prog-id=232 op=LOAD Dec 16 12:54:34.965000 audit[4514]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4501 pid=4514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.965000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364303661643731383735316663343131306433643831343463653832 Dec 16 12:54:34.967000 audit: BPF prog-id=232 op=UNLOAD Dec 16 12:54:34.967000 audit[4514]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4501 pid=4514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.967000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364303661643731383735316663343131306433643831343463653832 Dec 16 12:54:34.969000 audit: BPF prog-id=233 op=LOAD Dec 16 12:54:34.969000 audit[4514]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4501 pid=4514 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.969000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364303661643731383735316663343131306433643831343463653832 Dec 16 12:54:34.971000 audit: BPF prog-id=234 op=LOAD Dec 16 12:54:34.971000 audit[4514]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4501 pid=4514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.971000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364303661643731383735316663343131306433643831343463653832 Dec 16 12:54:34.971000 audit: BPF prog-id=234 op=UNLOAD Dec 16 12:54:34.971000 audit[4514]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4501 pid=4514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.971000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364303661643731383735316663343131306433643831343463653832 Dec 16 12:54:34.972000 audit: BPF prog-id=233 op=UNLOAD Dec 16 12:54:34.972000 audit[4514]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 
ppid=4501 pid=4514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.972000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364303661643731383735316663343131306433643831343463653832 Dec 16 12:54:34.972000 audit: BPF prog-id=235 op=LOAD Dec 16 12:54:34.972000 audit[4514]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=4501 pid=4514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:34.972000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364303661643731383735316663343131306433643831343463653832 Dec 16 12:54:34.979143 containerd[1649]: time="2025-12-16T12:54:34.979104797Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5fjr9,Uid:7255bf38-4b45-44eb-b0b0-8e800109b0ec,Namespace:kube-system,Attempt:0,} returns sandbox id \"e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5\"" Dec 16 12:54:34.988521 containerd[1649]: time="2025-12-16T12:54:34.988316555Z" level=info msg="CreateContainer within sandbox \"e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:54:35.004928 containerd[1649]: time="2025-12-16T12:54:35.004892387Z" level=info msg="Container d9043d919562abb75af55c290d2ac98b141799bcbb94c41cfe7131e21400f4f8: CDI devices from CRI Config.CDIDevices: []" Dec 16 
12:54:35.011030 containerd[1649]: time="2025-12-16T12:54:35.010927542Z" level=info msg="CreateContainer within sandbox \"e0edc1b11a6db20b54ea7b85dc9808976ece0bcd2289819f1ec9870e2427aaf5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d9043d919562abb75af55c290d2ac98b141799bcbb94c41cfe7131e21400f4f8\"" Dec 16 12:54:35.012538 containerd[1649]: time="2025-12-16T12:54:35.012433943Z" level=info msg="StartContainer for \"d9043d919562abb75af55c290d2ac98b141799bcbb94c41cfe7131e21400f4f8\"" Dec 16 12:54:35.014870 containerd[1649]: time="2025-12-16T12:54:35.014827282Z" level=info msg="connecting to shim d9043d919562abb75af55c290d2ac98b141799bcbb94c41cfe7131e21400f4f8" address="unix:///run/containerd/s/1ecd8568487dcefc9c37d5e7e798a396cbeb92fdf5aa32d3868948f78062ec5a" protocol=ttrpc version=3 Dec 16 12:54:35.039000 audit[4542]: NETFILTER_CFG table=filter:127 family=2 entries=50 op=nft_register_chain pid=4542 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:35.039000 audit[4542]: SYSCALL arch=c000003e syscall=46 success=yes exit=24928 a0=3 a1=7ffe5c3d3a50 a2=0 a3=7ffe5c3d3a3c items=0 ppid=4033 pid=4542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.039000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:35.049399 systemd[1]: Started cri-containerd-d9043d919562abb75af55c290d2ac98b141799bcbb94c41cfe7131e21400f4f8.scope - libcontainer container d9043d919562abb75af55c290d2ac98b141799bcbb94c41cfe7131e21400f4f8. 
Dec 16 12:54:35.059571 containerd[1649]: time="2025-12-16T12:54:35.059517989Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cn7m7,Uid:f5b0dddf-d30f-46e4-b43e-4240324fba15,Namespace:kube-system,Attempt:0,} returns sandbox id \"cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c\"" Dec 16 12:54:35.063961 containerd[1649]: time="2025-12-16T12:54:35.063941564Z" level=info msg="CreateContainer within sandbox \"cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:54:35.072591 containerd[1649]: time="2025-12-16T12:54:35.072282292Z" level=info msg="Container 34670b1eb2a4104abf0b933bb5f8608b34166d1b105e6663d5d878f60513446e: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:54:35.075992 containerd[1649]: time="2025-12-16T12:54:35.075973672Z" level=info msg="CreateContainer within sandbox \"cd06ad718751fc4110d3d8144ce82c0bc4bb78312e754e2dcafb9c7643c2be1c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"34670b1eb2a4104abf0b933bb5f8608b34166d1b105e6663d5d878f60513446e\"" Dec 16 12:54:35.077276 containerd[1649]: time="2025-12-16T12:54:35.077260720Z" level=info msg="StartContainer for \"34670b1eb2a4104abf0b933bb5f8608b34166d1b105e6663d5d878f60513446e\"" Dec 16 12:54:35.078015 containerd[1649]: time="2025-12-16T12:54:35.077978399Z" level=info msg="connecting to shim 34670b1eb2a4104abf0b933bb5f8608b34166d1b105e6663d5d878f60513446e" address="unix:///run/containerd/s/7aae8768edac350f1d17069e41739452097fa1aa523310de5423b9148896c5fe" protocol=ttrpc version=3 Dec 16 12:54:35.082000 audit: BPF prog-id=236 op=LOAD Dec 16 12:54:35.083000 audit: BPF prog-id=237 op=LOAD Dec 16 12:54:35.083000 audit[4543]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4447 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.083000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439303433643931393536326162623735616635356332393064326163 Dec 16 12:54:35.084000 audit: BPF prog-id=237 op=UNLOAD Dec 16 12:54:35.084000 audit[4543]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4447 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439303433643931393536326162623735616635356332393064326163 Dec 16 12:54:35.084000 audit: BPF prog-id=238 op=LOAD Dec 16 12:54:35.084000 audit[4543]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4447 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439303433643931393536326162623735616635356332393064326163 Dec 16 12:54:35.084000 audit: BPF prog-id=239 op=LOAD Dec 16 12:54:35.084000 audit[4543]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4447 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439303433643931393536326162623735616635356332393064326163 Dec 16 12:54:35.084000 audit: BPF prog-id=239 op=UNLOAD Dec 16 12:54:35.084000 audit[4543]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4447 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439303433643931393536326162623735616635356332393064326163 Dec 16 12:54:35.085000 audit: BPF prog-id=238 op=UNLOAD Dec 16 12:54:35.085000 audit[4543]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4447 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.085000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439303433643931393536326162623735616635356332393064326163 Dec 16 12:54:35.085000 audit: BPF prog-id=240 op=LOAD Dec 16 12:54:35.085000 audit[4543]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4447 pid=4543 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.085000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439303433643931393536326162623735616635356332393064326163 Dec 16 12:54:35.101307 systemd[1]: Started cri-containerd-34670b1eb2a4104abf0b933bb5f8608b34166d1b105e6663d5d878f60513446e.scope - libcontainer container 34670b1eb2a4104abf0b933bb5f8608b34166d1b105e6663d5d878f60513446e. Dec 16 12:54:35.120000 audit: BPF prog-id=241 op=LOAD Dec 16 12:54:35.120000 audit: BPF prog-id=242 op=LOAD Dec 16 12:54:35.120000 audit[4569]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4501 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334363730623165623261343130346162663062393333626235663836 Dec 16 12:54:35.120000 audit: BPF prog-id=242 op=UNLOAD Dec 16 12:54:35.120000 audit[4569]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4501 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.120000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334363730623165623261343130346162663062393333626235663836 Dec 16 12:54:35.120000 audit: BPF prog-id=243 op=LOAD Dec 16 12:54:35.120000 audit[4569]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4501 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334363730623165623261343130346162663062393333626235663836 Dec 16 12:54:35.121000 audit: BPF prog-id=244 op=LOAD Dec 16 12:54:35.121000 audit[4569]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4501 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.121000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334363730623165623261343130346162663062393333626235663836 Dec 16 12:54:35.121000 audit: BPF prog-id=244 op=UNLOAD Dec 16 12:54:35.121000 audit[4569]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4501 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:54:35.121000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334363730623165623261343130346162663062393333626235663836 Dec 16 12:54:35.121000 audit: BPF prog-id=243 op=UNLOAD Dec 16 12:54:35.121000 audit[4569]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4501 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.121000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334363730623165623261343130346162663062393333626235663836 Dec 16 12:54:35.121000 audit: BPF prog-id=245 op=LOAD Dec 16 12:54:35.121000 audit[4569]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4501 pid=4569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.121000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334363730623165623261343130346162663062393333626235663836 Dec 16 12:54:35.135753 containerd[1649]: time="2025-12-16T12:54:35.135707274Z" level=info msg="StartContainer for \"d9043d919562abb75af55c290d2ac98b141799bcbb94c41cfe7131e21400f4f8\" returns successfully" Dec 16 12:54:35.145898 containerd[1649]: time="2025-12-16T12:54:35.145853436Z" level=info msg="StartContainer for 
\"34670b1eb2a4104abf0b933bb5f8608b34166d1b105e6663d5d878f60513446e\" returns successfully" Dec 16 12:54:35.192082 containerd[1649]: time="2025-12-16T12:54:35.192027401Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f5b8bcc75-wg5dd,Uid:9345e167-5638-4038-a959-3d55222d2d5c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5f8276dda6ecda32d34ce3228a4ce639e8eb70a2949808acee99f2babb636e16\"" Dec 16 12:54:35.222000 audit[4619]: NETFILTER_CFG table=filter:128 family=2 entries=44 op=nft_register_chain pid=4619 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:35.222000 audit[4619]: SYSCALL arch=c000003e syscall=46 success=yes exit=21532 a0=3 a1=7fff50344e20 a2=0 a3=7fff50344e0c items=0 ppid=4033 pid=4619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.222000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:35.240346 containerd[1649]: time="2025-12-16T12:54:35.240290202Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:35.241937 containerd[1649]: time="2025-12-16T12:54:35.241881393Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:54:35.242051 containerd[1649]: time="2025-12-16T12:54:35.241945964Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:35.242585 kubelet[2807]: E1216 12:54:35.242534 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:54:35.242932 kubelet[2807]: E1216 12:54:35.242600 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:54:35.242932 kubelet[2807]: E1216 12:54:35.242796 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4bft4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f5b8bcc75-trldf_calico-apiserver(4afbd6d6-aac3-4d68-be88-76917639058c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:35.243491 containerd[1649]: time="2025-12-16T12:54:35.243464138Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:54:35.254235 kubelet[2807]: E1216 12:54:35.254191 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:54:35.347945 containerd[1649]: time="2025-12-16T12:54:35.347896663Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-7cc84b5-xz9fx,Uid:0318a864-5985-4f05-83eb-6e5fed8acf7e,Namespace:calico-system,Attempt:0,}" Dec 16 12:54:35.495658 systemd-networkd[1544]: cali4aac4ef7538: Link UP Dec 16 12:54:35.497361 systemd-networkd[1544]: cali4aac4ef7538: Gained carrier Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.418 [INFO][4630] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0 calico-kube-controllers-7cc84b5- calico-system 0318a864-5985-4f05-83eb-6e5fed8acf7e 835 0 2025-12-16 12:54:14 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7cc84b5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4515-1-0-8-2e3d7ab7bb calico-kube-controllers-7cc84b5-xz9fx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali4aac4ef7538 [] [] }} ContainerID="264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" Namespace="calico-system" Pod="calico-kube-controllers-7cc84b5-xz9fx" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.419 [INFO][4630] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" Namespace="calico-system" Pod="calico-kube-controllers-7cc84b5-xz9fx" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.449 [INFO][4641] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" 
HandleID="k8s-pod-network.264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.451 [INFO][4641] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" HandleID="k8s-pod-network.264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024efe0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-8-2e3d7ab7bb", "pod":"calico-kube-controllers-7cc84b5-xz9fx", "timestamp":"2025-12-16 12:54:35.449752405 +0000 UTC"}, Hostname:"ci-4515-1-0-8-2e3d7ab7bb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.451 [INFO][4641] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.451 [INFO][4641] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.451 [INFO][4641] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-8-2e3d7ab7bb' Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.459 [INFO][4641] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.464 [INFO][4641] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.468 [INFO][4641] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.470 [INFO][4641] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.472 [INFO][4641] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.472 [INFO][4641] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.473 [INFO][4641] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.480 [INFO][4641] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.487 [INFO][4641] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.106.6/26] block=192.168.106.0/26 handle="k8s-pod-network.264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.487 [INFO][4641] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.6/26] handle="k8s-pod-network.264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.487 [INFO][4641] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:54:35.522095 containerd[1649]: 2025-12-16 12:54:35.488 [INFO][4641] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.6/26] IPv6=[] ContainerID="264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" HandleID="k8s-pod-network.264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0" Dec 16 12:54:35.524091 containerd[1649]: 2025-12-16 12:54:35.491 [INFO][4630] cni-plugin/k8s.go 418: Populated endpoint ContainerID="264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" Namespace="calico-system" Pod="calico-kube-controllers-7cc84b5-xz9fx" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0", GenerateName:"calico-kube-controllers-7cc84b5-", Namespace:"calico-system", SelfLink:"", UID:"0318a864-5985-4f05-83eb-6e5fed8acf7e", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7cc84b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"", Pod:"calico-kube-controllers-7cc84b5-xz9fx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.106.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4aac4ef7538", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:35.524091 containerd[1649]: 2025-12-16 12:54:35.491 [INFO][4630] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.6/32] ContainerID="264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" Namespace="calico-system" Pod="calico-kube-controllers-7cc84b5-xz9fx" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0" Dec 16 12:54:35.524091 containerd[1649]: 2025-12-16 12:54:35.491 [INFO][4630] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4aac4ef7538 ContainerID="264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" Namespace="calico-system" Pod="calico-kube-controllers-7cc84b5-xz9fx" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0" Dec 16 12:54:35.524091 containerd[1649]: 2025-12-16 12:54:35.497 [INFO][4630] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" Namespace="calico-system" Pod="calico-kube-controllers-7cc84b5-xz9fx" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0" Dec 16 12:54:35.524091 containerd[1649]: 2025-12-16 12:54:35.497 [INFO][4630] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" Namespace="calico-system" Pod="calico-kube-controllers-7cc84b5-xz9fx" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0", GenerateName:"calico-kube-controllers-7cc84b5-", Namespace:"calico-system", SelfLink:"", UID:"0318a864-5985-4f05-83eb-6e5fed8acf7e", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7cc84b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c", Pod:"calico-kube-controllers-7cc84b5-xz9fx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.106.6/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4aac4ef7538", MAC:"aa:6a:e7:98:d4:81", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:35.524091 containerd[1649]: 2025-12-16 12:54:35.518 [INFO][4630] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" Namespace="calico-system" Pod="calico-kube-controllers-7cc84b5-xz9fx" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-calico--kube--controllers--7cc84b5--xz9fx-eth0" Dec 16 12:54:35.547000 audit[4664]: NETFILTER_CFG table=filter:129 family=2 entries=52 op=nft_register_chain pid=4664 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:35.547000 audit[4664]: SYSCALL arch=c000003e syscall=46 success=yes exit=24328 a0=3 a1=7fff71df8a10 a2=0 a3=7fff71df89fc items=0 ppid=4033 pid=4664 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.547000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:35.556547 containerd[1649]: time="2025-12-16T12:54:35.556515981Z" level=info msg="connecting to shim 264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c" address="unix:///run/containerd/s/40ea4d3173392d52ef11330917d2fb9aadc28504815d5f7b2eba263f29ed50de" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:54:35.564735 systemd-networkd[1544]: cali2cfc4da11a8: Gained IPv6LL Dec 16 12:54:35.604560 systemd[1]: Started cri-containerd-264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c.scope - libcontainer container 
264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c. Dec 16 12:54:35.630000 audit: BPF prog-id=246 op=LOAD Dec 16 12:54:35.631000 audit: BPF prog-id=247 op=LOAD Dec 16 12:54:35.631000 audit[4682]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4669 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.631000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236346534363036396565393834663363633961383935623264646162 Dec 16 12:54:35.631000 audit: BPF prog-id=247 op=UNLOAD Dec 16 12:54:35.631000 audit[4682]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4669 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.631000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236346534363036396565393834663363633961383935623264646162 Dec 16 12:54:35.633000 audit: BPF prog-id=248 op=LOAD Dec 16 12:54:35.633000 audit[4682]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4669 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.633000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236346534363036396565393834663363633961383935623264646162 Dec 16 12:54:35.633000 audit: BPF prog-id=249 op=LOAD Dec 16 12:54:35.633000 audit[4682]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4669 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.633000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236346534363036396565393834663363633961383935623264646162 Dec 16 12:54:35.633000 audit: BPF prog-id=249 op=UNLOAD Dec 16 12:54:35.633000 audit[4682]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4669 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.633000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236346534363036396565393834663363633961383935623264646162 Dec 16 12:54:35.633000 audit: BPF prog-id=248 op=UNLOAD Dec 16 12:54:35.633000 audit[4682]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4669 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:54:35.633000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236346534363036396565393834663363633961383935623264646162 Dec 16 12:54:35.633000 audit: BPF prog-id=250 op=LOAD Dec 16 12:54:35.633000 audit[4682]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4669 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.633000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236346534363036396565393834663363633961383935623264646162 Dec 16 12:54:35.676674 containerd[1649]: time="2025-12-16T12:54:35.676541798Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:35.679056 containerd[1649]: time="2025-12-16T12:54:35.678965935Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:54:35.679056 containerd[1649]: time="2025-12-16T12:54:35.679035716Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:35.680193 kubelet[2807]: E1216 12:54:35.679250 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:54:35.680193 kubelet[2807]: E1216 12:54:35.679340 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:54:35.680398 kubelet[2807]: E1216 12:54:35.680320 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn857,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f5b8bcc75-wg5dd_calico-apiserver(9345e167-5638-4038-a959-3d55222d2d5c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:35.681824 kubelet[2807]: E1216 12:54:35.681789 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:54:35.688799 containerd[1649]: time="2025-12-16T12:54:35.688731481Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cc84b5-xz9fx,Uid:0318a864-5985-4f05-83eb-6e5fed8acf7e,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"264e46069ee984f3cc9a895b2ddabfccaba4cf0c6ff05bc3d96fcafca9a3d21c\"" Dec 16 12:54:35.692627 containerd[1649]: time="2025-12-16T12:54:35.692607487Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:54:35.750351 kubelet[2807]: E1216 12:54:35.749234 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:54:35.752482 kubelet[2807]: E1216 12:54:35.752459 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:54:35.834450 kubelet[2807]: I1216 12:54:35.832990 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-cn7m7" podStartSLOduration=38.832957247 podStartE2EDuration="38.832957247s" podCreationTimestamp="2025-12-16 12:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:54:35.809550939 +0000 UTC m=+43.595204942" watchObservedRunningTime="2025-12-16 12:54:35.832957247 +0000 UTC m=+43.618611250" Dec 16 12:54:35.898684 
kubelet[2807]: I1216 12:54:35.898633 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-5fjr9" podStartSLOduration=38.898617902 podStartE2EDuration="38.898617902s" podCreationTimestamp="2025-12-16 12:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:54:35.865224782 +0000 UTC m=+43.650878785" watchObservedRunningTime="2025-12-16 12:54:35.898617902 +0000 UTC m=+43.684271904" Dec 16 12:54:35.937000 audit[4710]: NETFILTER_CFG table=filter:130 family=2 entries=20 op=nft_register_rule pid=4710 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:35.937000 audit[4710]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffcbc015620 a2=0 a3=7ffcbc01560c items=0 ppid=2949 pid=4710 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.937000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:35.942000 audit[4710]: NETFILTER_CFG table=nat:131 family=2 entries=14 op=nft_register_rule pid=4710 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:35.942000 audit[4710]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffcbc015620 a2=0 a3=0 items=0 ppid=2949 pid=4710 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.942000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:35.968000 audit[4712]: NETFILTER_CFG table=filter:132 family=2 entries=17 
op=nft_register_rule pid=4712 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:35.968000 audit[4712]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe9e3d4620 a2=0 a3=7ffe9e3d460c items=0 ppid=2949 pid=4712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.968000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:35.980000 audit[4712]: NETFILTER_CFG table=nat:133 family=2 entries=47 op=nft_register_chain pid=4712 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:35.980000 audit[4712]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffe9e3d4620 a2=0 a3=7ffe9e3d460c items=0 ppid=2949 pid=4712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:35.980000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:36.123823 containerd[1649]: time="2025-12-16T12:54:36.123005658Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:36.124682 containerd[1649]: time="2025-12-16T12:54:36.124603822Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:54:36.125088 containerd[1649]: time="2025-12-16T12:54:36.124651832Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active 
requests=0, bytes read=0" Dec 16 12:54:36.125525 kubelet[2807]: E1216 12:54:36.125466 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:54:36.125717 kubelet[2807]: E1216 12:54:36.125605 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:54:36.125899 kubelet[2807]: E1216 12:54:36.125740 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:ni
l,},VolumeMount{Name:kube-api-access-pk96r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7cc84b5-xz9fx_calico-system(0318a864-5985-4f05-83eb-6e5fed8acf7e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:36.127001 kubelet[2807]: E1216 12:54:36.126953 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:54:36.141324 systemd-networkd[1544]: cali61dc65d5646: Gained IPv6LL Dec 16 12:54:36.332332 systemd-networkd[1544]: calicf56c34825f: Gained IPv6LL Dec 16 12:54:36.716315 systemd-networkd[1544]: cali793bdaf5102: Gained IPv6LL Dec 16 12:54:36.783005 kubelet[2807]: E1216 12:54:36.782936 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:54:36.783726 kubelet[2807]: E1216 12:54:36.783259 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:54:36.783726 kubelet[2807]: E1216 12:54:36.783305 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound 
desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:54:36.972435 systemd-networkd[1544]: cali4aac4ef7538: Gained IPv6LL Dec 16 12:54:38.320610 containerd[1649]: time="2025-12-16T12:54:38.320559178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xdkpf,Uid:b5e01eba-2e7b-44aa-9650-696a129f0a90,Namespace:calico-system,Attempt:0,}" Dec 16 12:54:38.320930 containerd[1649]: time="2025-12-16T12:54:38.320796353Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9bvxr,Uid:83624a12-e59b-4753-81b9-815a3846bf01,Namespace:calico-system,Attempt:0,}" Dec 16 12:54:38.458622 systemd-networkd[1544]: cali31ff52d25c3: Link UP Dec 16 12:54:38.458773 systemd-networkd[1544]: cali31ff52d25c3: Gained carrier Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.369 [INFO][4714] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0 csi-node-driver- calico-system b5e01eba-2e7b-44aa-9650-696a129f0a90 732 0 2025-12-16 12:54:14 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4515-1-0-8-2e3d7ab7bb csi-node-driver-xdkpf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali31ff52d25c3 [] [] }} ContainerID="5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" Namespace="calico-system" Pod="csi-node-driver-xdkpf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-" Dec 
16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.369 [INFO][4714] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" Namespace="calico-system" Pod="csi-node-driver-xdkpf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0" Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.411 [INFO][4737] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" HandleID="k8s-pod-network.5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0" Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.412 [INFO][4737] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" HandleID="k8s-pod-network.5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d58c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-8-2e3d7ab7bb", "pod":"csi-node-driver-xdkpf", "timestamp":"2025-12-16 12:54:38.411658563 +0000 UTC"}, Hostname:"ci-4515-1-0-8-2e3d7ab7bb", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.412 [INFO][4737] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.412 [INFO][4737] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.412 [INFO][4737] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-8-2e3d7ab7bb' Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.418 [INFO][4737] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.422 [INFO][4737] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.425 [INFO][4737] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.426 [INFO][4737] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.429 [INFO][4737] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.429 [INFO][4737] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.431 [INFO][4737] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2 Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.435 [INFO][4737] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.444 [INFO][4737] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.106.7/26] block=192.168.106.0/26 handle="k8s-pod-network.5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.444 [INFO][4737] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.7/26] handle="k8s-pod-network.5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.444 [INFO][4737] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:54:38.474495 containerd[1649]: 2025-12-16 12:54:38.444 [INFO][4737] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.7/26] IPv6=[] ContainerID="5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" HandleID="k8s-pod-network.5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0" Dec 16 12:54:38.475345 containerd[1649]: 2025-12-16 12:54:38.455 [INFO][4714] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" Namespace="calico-system" Pod="csi-node-driver-xdkpf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b5e01eba-2e7b-44aa-9650-696a129f0a90", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"", Pod:"csi-node-driver-xdkpf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.106.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali31ff52d25c3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:38.475345 containerd[1649]: 2025-12-16 12:54:38.456 [INFO][4714] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.7/32] ContainerID="5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" Namespace="calico-system" Pod="csi-node-driver-xdkpf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0" Dec 16 12:54:38.475345 containerd[1649]: 2025-12-16 12:54:38.456 [INFO][4714] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali31ff52d25c3 ContainerID="5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" Namespace="calico-system" Pod="csi-node-driver-xdkpf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0" Dec 16 12:54:38.475345 containerd[1649]: 2025-12-16 12:54:38.458 [INFO][4714] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" Namespace="calico-system" Pod="csi-node-driver-xdkpf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0" Dec 16 12:54:38.475345 
containerd[1649]: 2025-12-16 12:54:38.458 [INFO][4714] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" Namespace="calico-system" Pod="csi-node-driver-xdkpf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b5e01eba-2e7b-44aa-9650-696a129f0a90", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2", Pod:"csi-node-driver-xdkpf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.106.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali31ff52d25c3", MAC:"d6:95:b5:13:ff:3c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:38.475345 containerd[1649]: 
2025-12-16 12:54:38.468 [INFO][4714] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" Namespace="calico-system" Pod="csi-node-driver-xdkpf" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-csi--node--driver--xdkpf-eth0" Dec 16 12:54:38.494000 audit[4760]: NETFILTER_CFG table=filter:134 family=2 entries=56 op=nft_register_chain pid=4760 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:38.494000 audit[4760]: SYSCALL arch=c000003e syscall=46 success=yes exit=25516 a0=3 a1=7fff4e9609d0 a2=0 a3=7fff4e9609bc items=0 ppid=4033 pid=4760 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.494000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:38.506936 containerd[1649]: time="2025-12-16T12:54:38.506852069Z" level=info msg="connecting to shim 5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2" address="unix:///run/containerd/s/8cf91b55270d7e450c658516fd299f6dc628c8b9a4f40d98cbc5009d5a05d54a" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:54:38.533622 systemd[1]: Started cri-containerd-5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2.scope - libcontainer container 5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2. 
Dec 16 12:54:38.548000 audit: BPF prog-id=251 op=LOAD Dec 16 12:54:38.549000 audit: BPF prog-id=252 op=LOAD Dec 16 12:54:38.549000 audit[4780]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=4769 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.549000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562336364646363656630646434363237313537383861386237653835 Dec 16 12:54:38.549000 audit: BPF prog-id=252 op=UNLOAD Dec 16 12:54:38.549000 audit[4780]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4769 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.549000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562336364646363656630646434363237313537383861386237653835 Dec 16 12:54:38.549000 audit: BPF prog-id=253 op=LOAD Dec 16 12:54:38.549000 audit[4780]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=4769 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.549000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562336364646363656630646434363237313537383861386237653835 Dec 16 12:54:38.549000 audit: BPF prog-id=254 op=LOAD Dec 16 12:54:38.549000 audit[4780]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=4769 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.549000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562336364646363656630646434363237313537383861386237653835 Dec 16 12:54:38.550000 audit: BPF prog-id=254 op=UNLOAD Dec 16 12:54:38.550000 audit[4780]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4769 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.550000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562336364646363656630646434363237313537383861386237653835 Dec 16 12:54:38.550000 audit: BPF prog-id=253 op=UNLOAD Dec 16 12:54:38.550000 audit[4780]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4769 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:54:38.550000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562336364646363656630646434363237313537383861386237653835 Dec 16 12:54:38.550000 audit: BPF prog-id=255 op=LOAD Dec 16 12:54:38.550000 audit[4780]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=4769 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.550000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3562336364646363656630646434363237313537383861386237653835 Dec 16 12:54:38.569079 systemd-networkd[1544]: cali0e79c4b8356: Link UP Dec 16 12:54:38.591317 systemd-networkd[1544]: cali0e79c4b8356: Gained carrier Dec 16 12:54:38.594709 containerd[1649]: time="2025-12-16T12:54:38.594682562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xdkpf,Uid:b5e01eba-2e7b-44aa-9650-696a129f0a90,Namespace:calico-system,Attempt:0,} returns sandbox id \"5b3cddccef0dd462715788a8b7e852b5539cb1d0d839f637e5a76b27576ee6f2\"" Dec 16 12:54:38.602780 containerd[1649]: time="2025-12-16T12:54:38.602736786Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.372 [INFO][4717] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0 goldmane-666569f655- calico-system 83624a12-e59b-4753-81b9-815a3846bf01 829 0 2025-12-16 12:54:12 +0000 UTC map[app.kubernetes.io/name:goldmane 
k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4515-1-0-8-2e3d7ab7bb goldmane-666569f655-9bvxr eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali0e79c4b8356 [] [] }} ContainerID="dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" Namespace="calico-system" Pod="goldmane-666569f655-9bvxr" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.372 [INFO][4717] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" Namespace="calico-system" Pod="goldmane-666569f655-9bvxr" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.415 [INFO][4739] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" HandleID="k8s-pod-network.dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.415 [INFO][4739] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" HandleID="k8s-pod-network.dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5940), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-8-2e3d7ab7bb", "pod":"goldmane-666569f655-9bvxr", "timestamp":"2025-12-16 12:54:38.415055374 +0000 UTC"}, Hostname:"ci-4515-1-0-8-2e3d7ab7bb", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.415 [INFO][4739] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.444 [INFO][4739] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.445 [INFO][4739] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-8-2e3d7ab7bb' Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.519 [INFO][4739] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.527 [INFO][4739] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.532 [INFO][4739] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.536 [INFO][4739] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.538 [INFO][4739] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.538 [INFO][4739] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.540 [INFO][4739] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84 Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.544 [INFO][4739] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.555 [INFO][4739] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.106.8/26] block=192.168.106.0/26 handle="k8s-pod-network.dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.555 [INFO][4739] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.8/26] handle="k8s-pod-network.dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" host="ci-4515-1-0-8-2e3d7ab7bb" Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.555 [INFO][4739] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:54:38.610833 containerd[1649]: 2025-12-16 12:54:38.555 [INFO][4739] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.8/26] IPv6=[] ContainerID="dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" HandleID="k8s-pod-network.dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" Workload="ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0" Dec 16 12:54:38.612031 containerd[1649]: 2025-12-16 12:54:38.558 [INFO][4717] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" Namespace="calico-system" Pod="goldmane-666569f655-9bvxr" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"83624a12-e59b-4753-81b9-815a3846bf01", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"", Pod:"goldmane-666569f655-9bvxr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.106.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0e79c4b8356", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:38.612031 containerd[1649]: 2025-12-16 12:54:38.558 [INFO][4717] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.8/32] ContainerID="dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" Namespace="calico-system" Pod="goldmane-666569f655-9bvxr" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0" Dec 16 12:54:38.612031 containerd[1649]: 2025-12-16 12:54:38.559 [INFO][4717] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0e79c4b8356 ContainerID="dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" Namespace="calico-system" Pod="goldmane-666569f655-9bvxr" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0" Dec 16 12:54:38.612031 containerd[1649]: 2025-12-16 12:54:38.591 [INFO][4717] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" Namespace="calico-system" Pod="goldmane-666569f655-9bvxr" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0" Dec 16 12:54:38.612031 containerd[1649]: 2025-12-16 12:54:38.592 [INFO][4717] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" Namespace="calico-system" Pod="goldmane-666569f655-9bvxr" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0", GenerateName:"goldmane-666569f655-", 
Namespace:"calico-system", SelfLink:"", UID:"83624a12-e59b-4753-81b9-815a3846bf01", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 54, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-8-2e3d7ab7bb", ContainerID:"dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84", Pod:"goldmane-666569f655-9bvxr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.106.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0e79c4b8356", MAC:"be:39:f5:ec:0f:e8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:54:38.612031 containerd[1649]: 2025-12-16 12:54:38.604 [INFO][4717] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" Namespace="calico-system" Pod="goldmane-666569f655-9bvxr" WorkloadEndpoint="ci--4515--1--0--8--2e3d7ab7bb-k8s-goldmane--666569f655--9bvxr-eth0" Dec 16 12:54:38.626000 audit[4816]: NETFILTER_CFG table=filter:135 family=2 entries=68 op=nft_register_chain pid=4816 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:54:38.626000 audit[4816]: SYSCALL arch=c000003e syscall=46 success=yes exit=32308 a0=3 a1=7ffdf44a0990 a2=0 a3=7ffdf44a097c items=0 ppid=4033 pid=4816 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.626000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:54:38.633577 containerd[1649]: time="2025-12-16T12:54:38.633518305Z" level=info msg="connecting to shim dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84" address="unix:///run/containerd/s/2deea836983d779e135516b6e46eaa85cc95676ca131d6819c71b20ccb2dd535" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:54:38.657291 systemd[1]: Started cri-containerd-dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84.scope - libcontainer container dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84. Dec 16 12:54:38.670000 audit: BPF prog-id=256 op=LOAD Dec 16 12:54:38.670000 audit: BPF prog-id=257 op=LOAD Dec 16 12:54:38.670000 audit[4836]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4826 pid=4836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463363133303437626235656165646334346339653836303164363961 Dec 16 12:54:38.670000 audit: BPF prog-id=257 op=UNLOAD Dec 16 12:54:38.670000 audit[4836]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4826 pid=4836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:54:38.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463363133303437626235656165646334346339653836303164363961 Dec 16 12:54:38.670000 audit: BPF prog-id=258 op=LOAD Dec 16 12:54:38.670000 audit[4836]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4826 pid=4836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463363133303437626235656165646334346339653836303164363961 Dec 16 12:54:38.670000 audit: BPF prog-id=259 op=LOAD Dec 16 12:54:38.670000 audit[4836]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4826 pid=4836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463363133303437626235656165646334346339653836303164363961 Dec 16 12:54:38.670000 audit: BPF prog-id=259 op=UNLOAD Dec 16 12:54:38.670000 audit[4836]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4826 pid=4836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463363133303437626235656165646334346339653836303164363961 Dec 16 12:54:38.670000 audit: BPF prog-id=258 op=UNLOAD Dec 16 12:54:38.670000 audit[4836]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4826 pid=4836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463363133303437626235656165646334346339653836303164363961 Dec 16 12:54:38.670000 audit: BPF prog-id=260 op=LOAD Dec 16 12:54:38.670000 audit[4836]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4826 pid=4836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:38.670000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463363133303437626235656165646334346339653836303164363961 Dec 16 12:54:38.708065 containerd[1649]: time="2025-12-16T12:54:38.708012818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9bvxr,Uid:83624a12-e59b-4753-81b9-815a3846bf01,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"dc613047bb5eaedc44c9e8601d69a3801ec1b2d7c65c910c623ed12e2d5c3d84\"" Dec 16 12:54:39.017849 containerd[1649]: time="2025-12-16T12:54:39.017776988Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:39.018903 containerd[1649]: time="2025-12-16T12:54:39.018843181Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:54:39.018986 containerd[1649]: time="2025-12-16T12:54:39.018957355Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:39.019138 kubelet[2807]: E1216 12:54:39.019094 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:54:39.019445 kubelet[2807]: E1216 12:54:39.019148 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:54:39.019554 kubelet[2807]: E1216 12:54:39.019492 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn6d5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-xdkpf_calico-system(b5e01eba-2e7b-44aa-9650-696a129f0a90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:54:39.020909 containerd[1649]: time="2025-12-16T12:54:39.020874167Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:54:39.451118 containerd[1649]: time="2025-12-16T12:54:39.450992360Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:39.452223 containerd[1649]: time="2025-12-16T12:54:39.452113587Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:54:39.452223 containerd[1649]: time="2025-12-16T12:54:39.452218093Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:39.452540 kubelet[2807]: E1216 12:54:39.452487 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:54:39.452603 kubelet[2807]: E1216 12:54:39.452552 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:54:39.453356 containerd[1649]: time="2025-12-16T12:54:39.453331786Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:54:39.454147 kubelet[2807]: E1216 12:54:39.453868 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7gp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9bvxr_calico-system(83624a12-e59b-4753-81b9-815a3846bf01): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:39.455514 kubelet[2807]: E1216 12:54:39.455471 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:54:39.788365 systemd-networkd[1544]: cali0e79c4b8356: Gained IPv6LL Dec 16 12:54:39.789844 systemd-networkd[1544]: cali31ff52d25c3: Gained IPv6LL Dec 16 12:54:39.795049 kubelet[2807]: E1216 12:54:39.794772 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:54:39.875057 containerd[1649]: time="2025-12-16T12:54:39.875011225Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:39.878178 containerd[1649]: time="2025-12-16T12:54:39.876351473Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:54:39.878572 kubelet[2807]: E1216 12:54:39.878541 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:54:39.878640 containerd[1649]: time="2025-12-16T12:54:39.876512475Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:39.878750 kubelet[2807]: E1216 12:54:39.878733 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:54:39.878953 kubelet[2807]: E1216 12:54:39.878921 2807 kuberuntime_manager.go:1341] "Unhandled Error" 
err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn6d5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-xdkpf_calico-system(b5e01eba-2e7b-44aa-9650-696a129f0a90): ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:39.880233 kubelet[2807]: E1216 12:54:39.880191 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:54:39.986398 kernel: kauditd_printk_skb: 221 callbacks suppressed Dec 16 12:54:39.986545 kernel: audit: type=1325 audit(1765889679.980:746): table=filter:136 family=2 entries=14 op=nft_register_rule pid=4865 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:39.980000 audit[4865]: NETFILTER_CFG table=filter:136 family=2 entries=14 op=nft_register_rule pid=4865 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:39.980000 audit[4865]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffcd183b9b0 a2=0 a3=7ffcd183b99c items=0 ppid=2949 pid=4865 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:39.999082 kernel: audit: type=1300 audit(1765889679.980:746): arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffcd183b9b0 a2=0 a3=7ffcd183b99c items=0 ppid=2949 pid=4865 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:39.999293 kernel: audit: type=1327 audit(1765889679.980:746): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:39.980000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:39.986000 audit[4865]: NETFILTER_CFG table=nat:137 family=2 entries=20 op=nft_register_rule pid=4865 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:40.005348 kernel: audit: type=1325 audit(1765889679.986:747): table=nat:137 family=2 entries=20 op=nft_register_rule pid=4865 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:54:40.005979 kernel: audit: type=1300 audit(1765889679.986:747): arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffcd183b9b0 a2=0 a3=7ffcd183b99c items=0 ppid=2949 pid=4865 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:39.986000 audit[4865]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffcd183b9b0 a2=0 a3=7ffcd183b99c items=0 ppid=2949 pid=4865 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:54:40.014432 kernel: audit: type=1327 audit(1765889679.986:747): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:39.986000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:54:40.811379 kubelet[2807]: E1216 12:54:40.811131 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:54:43.316328 containerd[1649]: time="2025-12-16T12:54:43.316046856Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:54:43.742741 containerd[1649]: time="2025-12-16T12:54:43.742257312Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:43.744113 containerd[1649]: time="2025-12-16T12:54:43.744035962Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:54:43.744113 containerd[1649]: time="2025-12-16T12:54:43.744069635Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:43.744516 kubelet[2807]: E1216 12:54:43.744393 2807 log.go:32] "PullImage from 
image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:54:43.744516 kubelet[2807]: E1216 12:54:43.744445 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:54:43.745200 kubelet[2807]: E1216 12:54:43.744559 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:6195276ecb614f0fa525b9a7c33c407b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nv486,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:fal
se,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6746c4dc5c-qzpt8_calico-system(662f30c6-4ed6-44dc-96b4-74080eea2751): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:43.748076 containerd[1649]: time="2025-12-16T12:54:43.748049600Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:54:44.173233 containerd[1649]: time="2025-12-16T12:54:44.173177525Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:44.174518 containerd[1649]: time="2025-12-16T12:54:44.174432241Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:54:44.174960 containerd[1649]: time="2025-12-16T12:54:44.174521689Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:44.175025 kubelet[2807]: E1216 12:54:44.174642 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:54:44.175025 kubelet[2807]: E1216 12:54:44.174710 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:54:44.175025 kubelet[2807]: E1216 12:54:44.174859 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nv486,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessag
ePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6746c4dc5c-qzpt8_calico-system(662f30c6-4ed6-44dc-96b4-74080eea2751): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:44.176494 kubelet[2807]: E1216 12:54:44.176442 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:54:47.315346 containerd[1649]: time="2025-12-16T12:54:47.315286267Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:54:47.765322 containerd[1649]: time="2025-12-16T12:54:47.765266145Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:47.766419 containerd[1649]: time="2025-12-16T12:54:47.766376970Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:54:47.766517 containerd[1649]: time="2025-12-16T12:54:47.766459444Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:47.766695 kubelet[2807]: E1216 12:54:47.766658 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:54:47.766915 kubelet[2807]: E1216 12:54:47.766725 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:54:47.767023 kubelet[2807]: E1216 12:54:47.766978 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn857,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f5b8bcc75-wg5dd_calico-apiserver(9345e167-5638-4038-a959-3d55222d2d5c): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:47.768204 kubelet[2807]: E1216 12:54:47.768177 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:54:49.315395 containerd[1649]: time="2025-12-16T12:54:49.315068686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:54:49.747180 containerd[1649]: time="2025-12-16T12:54:49.745860607Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:49.747732 containerd[1649]: time="2025-12-16T12:54:49.747679231Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:54:49.747829 containerd[1649]: time="2025-12-16T12:54:49.747798244Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:49.748030 kubelet[2807]: E1216 12:54:49.747966 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:54:49.748030 kubelet[2807]: E1216 12:54:49.748019 2807 kuberuntime_image.go:55] "Failed to pull image" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:54:49.748633 kubelet[2807]: E1216 12:54:49.748122 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4bft4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f5b8bcc75-trldf_calico-apiserver(4afbd6d6-aac3-4d68-be88-76917639058c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:49.749394 kubelet[2807]: E1216 12:54:49.749351 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:54:51.318031 containerd[1649]: time="2025-12-16T12:54:51.317377577Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:54:51.743149 containerd[1649]: time="2025-12-16T12:54:51.743028489Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:54:51.744263 containerd[1649]: time="2025-12-16T12:54:51.744142090Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:54:51.744263 containerd[1649]: time="2025-12-16T12:54:51.744205449Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:51.744553 kubelet[2807]: E1216 12:54:51.744492 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:54:51.745190 kubelet[2807]: E1216 12:54:51.744565 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:54:51.745190 kubelet[2807]: E1216 12:54:51.745018 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7gp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9bvxr_calico-system(83624a12-e59b-4753-81b9-815a3846bf01): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:51.745316 containerd[1649]: time="2025-12-16T12:54:51.744905132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:54:51.746441 kubelet[2807]: E1216 12:54:51.746411 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:54:52.176327 containerd[1649]: time="2025-12-16T12:54:52.176003113Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:52.177753 containerd[1649]: 
time="2025-12-16T12:54:52.177628232Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:54:52.177753 containerd[1649]: time="2025-12-16T12:54:52.177695428Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:52.178768 kubelet[2807]: E1216 12:54:52.178593 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:54:52.179075 kubelet[2807]: E1216 12:54:52.178910 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:54:52.179392 kubelet[2807]: E1216 12:54:52.179233 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pk96r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7cc84b5-xz9fx_calico-system(0318a864-5985-4f05-83eb-6e5fed8acf7e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:52.180439 containerd[1649]: time="2025-12-16T12:54:52.179975939Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:54:52.181051 kubelet[2807]: E1216 12:54:52.180985 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:54:52.605772 containerd[1649]: time="2025-12-16T12:54:52.605421917Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:54:52.607571 containerd[1649]: time="2025-12-16T12:54:52.607387124Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:54:52.607571 containerd[1649]: time="2025-12-16T12:54:52.607471964Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:52.610199 kubelet[2807]: E1216 12:54:52.609541 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:54:52.610199 kubelet[2807]: E1216 12:54:52.609646 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:54:52.610199 kubelet[2807]: E1216 12:54:52.609813 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn6d5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-xdkpf_calico-system(b5e01eba-2e7b-44aa-9650-696a129f0a90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:54:52.612922 containerd[1649]: time="2025-12-16T12:54:52.612887779Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:54:53.061519 containerd[1649]: time="2025-12-16T12:54:53.061423379Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:54:53.062786 containerd[1649]: time="2025-12-16T12:54:53.062669949Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:54:53.065545 kubelet[2807]: E1216 12:54:53.064414 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:54:53.065545 kubelet[2807]: E1216 12:54:53.064479 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:54:53.065545 kubelet[2807]: E1216 12:54:53.064615 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn6d5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-xdkpf_calico-system(b5e01eba-2e7b-44aa-9650-696a129f0a90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:54:53.069003 kubelet[2807]: E1216 12:54:53.066088 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:54:53.072672 containerd[1649]: time="2025-12-16T12:54:53.062884672Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:54:57.315320 kubelet[2807]: E1216 12:54:57.315273 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" 
Dec 16 12:55:00.316623 kubelet[2807]: E1216 12:55:00.316575 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:55:02.317293 kubelet[2807]: E1216 12:55:02.317242 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:55:03.315593 kubelet[2807]: E1216 12:55:03.315503 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:55:04.316534 kubelet[2807]: E1216 12:55:04.316450 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:55:05.314785 kubelet[2807]: E1216 12:55:05.314708 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:55:10.318854 containerd[1649]: time="2025-12-16T12:55:10.318811260Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:55:10.755974 containerd[1649]: time="2025-12-16T12:55:10.755927617Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:55:10.757314 containerd[1649]: time="2025-12-16T12:55:10.757274958Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:55:10.757474 containerd[1649]: 
time="2025-12-16T12:55:10.757349436Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:55:10.757516 kubelet[2807]: E1216 12:55:10.757470 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:55:10.758117 kubelet[2807]: E1216 12:55:10.757513 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:55:10.758117 kubelet[2807]: E1216 12:55:10.757618 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:6195276ecb614f0fa525b9a7c33c407b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nv486,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10
001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6746c4dc5c-qzpt8_calico-system(662f30c6-4ed6-44dc-96b4-74080eea2751): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:55:10.760542 containerd[1649]: time="2025-12-16T12:55:10.760494033Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:55:11.195710 containerd[1649]: time="2025-12-16T12:55:11.195575965Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:55:11.196684 containerd[1649]: time="2025-12-16T12:55:11.196627827Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:55:11.196758 containerd[1649]: time="2025-12-16T12:55:11.196719387Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:55:11.196911 kubelet[2807]: E1216 12:55:11.196847 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:55:11.196911 kubelet[2807]: E1216 12:55:11.196897 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:55:11.197047 kubelet[2807]: E1216 12:55:11.197002 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nv486,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalat
ion:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6746c4dc5c-qzpt8_calico-system(662f30c6-4ed6-44dc-96b4-74080eea2751): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:55:11.198548 kubelet[2807]: E1216 12:55:11.198508 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:55:14.316148 containerd[1649]: time="2025-12-16T12:55:14.315864014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:55:14.755189 containerd[1649]: time="2025-12-16T12:55:14.755108192Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:55:14.756379 containerd[1649]: time="2025-12-16T12:55:14.756335954Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:55:14.756520 containerd[1649]: time="2025-12-16T12:55:14.756423336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:55:14.756627 kubelet[2807]: E1216 12:55:14.756576 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:55:14.756982 kubelet[2807]: E1216 12:55:14.756636 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:55:14.756982 kubelet[2807]: E1216 12:55:14.756777 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4bft4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f5b8bcc75-trldf_calico-apiserver(4afbd6d6-aac3-4d68-be88-76917639058c): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:55:14.758374 kubelet[2807]: E1216 12:55:14.758324 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:55:15.317056 containerd[1649]: time="2025-12-16T12:55:15.316772074Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:55:15.759876 containerd[1649]: time="2025-12-16T12:55:15.759827353Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:55:15.761177 containerd[1649]: time="2025-12-16T12:55:15.761086623Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:55:15.761398 containerd[1649]: time="2025-12-16T12:55:15.761210182Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:55:15.761491 kubelet[2807]: E1216 12:55:15.761401 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:55:15.762294 kubelet[2807]: E1216 12:55:15.761529 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:55:15.762294 kubelet[2807]: E1216 12:55:15.761703 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7gp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9bvxr_calico-system(83624a12-e59b-4753-81b9-815a3846bf01): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:55:15.762906 kubelet[2807]: E1216 12:55:15.762862 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:55:16.315469 containerd[1649]: time="2025-12-16T12:55:16.315431329Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:55:16.737492 containerd[1649]: time="2025-12-16T12:55:16.737006257Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:55:16.738114 containerd[1649]: time="2025-12-16T12:55:16.738053102Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:55:16.738228 containerd[1649]: time="2025-12-16T12:55:16.738142128Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:55:16.738656 kubelet[2807]: E1216 12:55:16.738564 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:55:16.738656 kubelet[2807]: E1216 12:55:16.738632 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:55:16.738997 kubelet[2807]: E1216 12:55:16.738934 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pk96r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7cc84b5-xz9fx_calico-system(0318a864-5985-4f05-83eb-6e5fed8acf7e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:55:16.740650 kubelet[2807]: E1216 12:55:16.740364 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:55:17.317463 containerd[1649]: time="2025-12-16T12:55:17.317399333Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:55:17.759133 containerd[1649]: time="2025-12-16T12:55:17.759064493Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:55:17.761129 containerd[1649]: time="2025-12-16T12:55:17.760952023Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:55:17.761641 containerd[1649]: time="2025-12-16T12:55:17.761028034Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:55:17.762318 kubelet[2807]: E1216 12:55:17.762232 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:55:17.763973 kubelet[2807]: E1216 12:55:17.762457 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:55:17.764247 kubelet[2807]: E1216 12:55:17.763471 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn6d5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-xdkpf_calico-system(b5e01eba-2e7b-44aa-9650-696a129f0a90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:55:17.768646 containerd[1649]: time="2025-12-16T12:55:17.768503054Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:55:18.212456 containerd[1649]: time="2025-12-16T12:55:18.212349475Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:55:18.214345 containerd[1649]: time="2025-12-16T12:55:18.214098047Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:55:18.214345 containerd[1649]: time="2025-12-16T12:55:18.214195789Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:55:18.214827 kubelet[2807]: E1216 12:55:18.214593 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:55:18.214827 kubelet[2807]: E1216 12:55:18.214804 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:55:18.215648 kubelet[2807]: E1216 12:55:18.215600 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn6d5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-xdkpf_calico-system(b5e01eba-2e7b-44aa-9650-696a129f0a90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:55:18.217435 kubelet[2807]: E1216 12:55:18.217398 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:55:18.319066 containerd[1649]: time="2025-12-16T12:55:18.318770071Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:55:18.760019 containerd[1649]: time="2025-12-16T12:55:18.759846918Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:55:18.761074 containerd[1649]: time="2025-12-16T12:55:18.761027554Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:55:18.761757 containerd[1649]: time="2025-12-16T12:55:18.761102654Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:55:18.761806 kubelet[2807]: E1216 12:55:18.761294 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:55:18.761806 kubelet[2807]: E1216 12:55:18.761344 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:55:18.761806 kubelet[2807]: E1216 12:55:18.761471 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn857,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f5b8bcc75-wg5dd_calico-apiserver(9345e167-5638-4038-a959-3d55222d2d5c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:55:18.762776 kubelet[2807]: E1216 12:55:18.762714 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:55:26.317270 kubelet[2807]: E1216 12:55:26.317214 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:55:28.316304 kubelet[2807]: E1216 12:55:28.316262 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:55:28.948352 systemd[1]: Started sshd@7-77.42.41.174:22-147.75.109.163:55628.service - OpenSSH per-connection server daemon (147.75.109.163:55628). Dec 16 12:55:28.948000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.41.174:22-147.75.109.163:55628 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:28.956189 kernel: audit: type=1130 audit(1765889728.948:748): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.41.174:22-147.75.109.163:55628 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:55:29.314810 kubelet[2807]: E1216 12:55:29.314756 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:55:29.959000 audit[4927]: USER_ACCT pid=4927 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:29.966220 kernel: audit: type=1101 audit(1765889729.959:749): pid=4927 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:29.966300 sshd[4927]: Accepted publickey for core from 147.75.109.163 port 55628 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:55:29.966000 audit[4927]: CRED_ACQ pid=4927 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:29.973265 kernel: audit: type=1103 audit(1765889729.966:750): pid=4927 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:29.966000 audit[4927]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdcad12690 a2=3 a3=0 items=0 ppid=1 pid=4927 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:55:29.978529 kernel: audit: type=1006 audit(1765889729.966:751): pid=4927 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=8 res=1 Dec 16 12:55:29.978574 kernel: audit: type=1300 audit(1765889729.966:751): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdcad12690 a2=3 a3=0 items=0 ppid=1 pid=4927 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:55:29.983610 sshd-session[4927]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:55:29.985576 kernel: audit: type=1327 audit(1765889729.966:751): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:55:29.966000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:55:29.995397 systemd-logind[1616]: New session 8 of user core. Dec 16 12:55:30.004427 systemd[1]: Started session-8.scope - Session 8 of User core. 
Dec 16 12:55:30.010000 audit[4927]: USER_START pid=4927 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:30.018000 audit[4930]: CRED_ACQ pid=4930 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:30.019729 kernel: audit: type=1105 audit(1765889730.010:752): pid=4927 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:30.019782 kernel: audit: type=1103 audit(1765889730.018:753): pid=4930 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:30.316298 kubelet[2807]: E1216 12:55:30.316222 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:55:30.319637 kubelet[2807]: E1216 12:55:30.318699 2807 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:55:31.072717 sshd[4930]: Connection closed by 147.75.109.163 port 55628 Dec 16 12:55:31.074405 sshd-session[4927]: pam_unix(sshd:session): session closed for user core Dec 16 12:55:31.076000 audit[4927]: USER_END pid=4927 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:31.080645 systemd-logind[1616]: Session 8 logged out. Waiting for processes to exit. Dec 16 12:55:31.081320 systemd[1]: sshd@7-77.42.41.174:22-147.75.109.163:55628.service: Deactivated successfully. Dec 16 12:55:31.083807 systemd[1]: session-8.scope: Deactivated successfully. Dec 16 12:55:31.085894 systemd-logind[1616]: Removed session 8. 
Dec 16 12:55:31.087185 kernel: audit: type=1106 audit(1765889731.076:754): pid=4927 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:31.076000 audit[4927]: CRED_DISP pid=4927 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:31.079000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.41.174:22-147.75.109.163:55628 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:31.096186 kernel: audit: type=1104 audit(1765889731.076:755): pid=4927 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:31.314017 kubelet[2807]: E1216 12:55:31.313967 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:55:36.242590 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:55:36.244028 kernel: audit: type=1130 audit(1765889736.234:757): pid=1 uid=0 auid=4294967295 
ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.41.174:22-147.75.109.163:42870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:36.234000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.41.174:22-147.75.109.163:42870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:36.235506 systemd[1]: Started sshd@8-77.42.41.174:22-147.75.109.163:42870.service - OpenSSH per-connection server daemon (147.75.109.163:42870). Dec 16 12:55:37.167000 audit[4968]: USER_ACCT pid=4968 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.172309 sshd[4968]: Accepted publickey for core from 147.75.109.163 port 42870 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:55:37.173516 sshd-session[4968]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:55:37.177178 kernel: audit: type=1101 audit(1765889737.167:758): pid=4968 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.167000 audit[4968]: CRED_ACQ pid=4968 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.186355 kernel: audit: type=1103 audit(1765889737.167:759): pid=4968 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.186430 kernel: audit: type=1006 audit(1765889737.167:760): pid=4968 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Dec 16 12:55:37.167000 audit[4968]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc07cf96e0 a2=3 a3=0 items=0 ppid=1 pid=4968 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:55:37.191971 kernel: audit: type=1300 audit(1765889737.167:760): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc07cf96e0 a2=3 a3=0 items=0 ppid=1 pid=4968 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:55:37.200828 systemd-logind[1616]: New session 9 of user core. Dec 16 12:55:37.167000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:55:37.204181 kernel: audit: type=1327 audit(1765889737.167:760): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:55:37.205294 systemd[1]: Started session-9.scope - Session 9 of User core. 
Dec 16 12:55:37.206000 audit[4968]: USER_START pid=4968 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.215179 kernel: audit: type=1105 audit(1765889737.206:761): pid=4968 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.216000 audit[4971]: CRED_ACQ pid=4971 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.223182 kernel: audit: type=1103 audit(1765889737.216:762): pid=4971 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.884745 sshd[4971]: Connection closed by 147.75.109.163 port 42870 Dec 16 12:55:37.885454 sshd-session[4968]: pam_unix(sshd:session): session closed for user core Dec 16 12:55:37.885000 audit[4968]: USER_END pid=4968 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.895455 systemd[1]: sshd@8-77.42.41.174:22-147.75.109.163:42870.service: Deactivated successfully. 
Dec 16 12:55:37.897243 kernel: audit: type=1106 audit(1765889737.885:763): pid=4968 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.897290 kernel: audit: type=1104 audit(1765889737.886:764): pid=4968 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.886000 audit[4968]: CRED_DISP pid=4968 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:37.898726 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 12:55:37.895000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.41.174:22-147.75.109.163:42870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:37.903440 systemd-logind[1616]: Session 9 logged out. Waiting for processes to exit. Dec 16 12:55:37.904409 systemd-logind[1616]: Removed session 9. 
Dec 16 12:55:38.317673 kubelet[2807]: E1216 12:55:38.317453 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:55:39.314134 kubelet[2807]: E1216 12:55:39.314089 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:55:43.057236 systemd[1]: Started sshd@9-77.42.41.174:22-147.75.109.163:52266.service - OpenSSH per-connection server daemon (147.75.109.163:52266). 
Dec 16 12:55:43.060354 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:55:43.060423 kernel: audit: type=1130 audit(1765889743.056:766): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-77.42.41.174:22-147.75.109.163:52266 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:43.056000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-77.42.41.174:22-147.75.109.163:52266 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:43.313792 kubelet[2807]: E1216 12:55:43.313675 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:55:43.318071 kubelet[2807]: E1216 12:55:43.318009 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:55:43.319029 kubelet[2807]: E1216 12:55:43.318883 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:55:43.912000 audit[4984]: USER_ACCT pid=4984 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:43.915409 sshd-session[4984]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:55:43.921632 kernel: audit: type=1101 audit(1765889743.912:767): pid=4984 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:43.921659 sshd[4984]: Accepted publickey for core from 147.75.109.163 port 52266 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:55:43.913000 audit[4984]: CRED_ACQ pid=4984 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 
addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:43.929197 kernel: audit: type=1103 audit(1765889743.913:768): pid=4984 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:43.931137 systemd-logind[1616]: New session 10 of user core. Dec 16 12:55:43.936347 kernel: audit: type=1006 audit(1765889743.913:769): pid=4984 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Dec 16 12:55:43.936075 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 16 12:55:43.913000 audit[4984]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc9701df50 a2=3 a3=0 items=0 ppid=1 pid=4984 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:55:43.948259 kernel: audit: type=1300 audit(1765889743.913:769): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc9701df50 a2=3 a3=0 items=0 ppid=1 pid=4984 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:55:43.948307 kernel: audit: type=1327 audit(1765889743.913:769): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:55:43.913000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:55:43.955135 kernel: audit: type=1105 audit(1765889743.937:770): pid=4984 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 
addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:43.960606 kernel: audit: type=1103 audit(1765889743.944:771): pid=4987 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:43.937000 audit[4984]: USER_START pid=4984 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:43.944000 audit[4987]: CRED_ACQ pid=4987 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:44.467237 sshd[4987]: Connection closed by 147.75.109.163 port 52266 Dec 16 12:55:44.467718 sshd-session[4984]: pam_unix(sshd:session): session closed for user core Dec 16 12:55:44.467000 audit[4984]: USER_END pid=4984 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:44.471708 systemd[1]: sshd@9-77.42.41.174:22-147.75.109.163:52266.service: Deactivated successfully. Dec 16 12:55:44.473709 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 12:55:44.476119 systemd-logind[1616]: Session 10 logged out. Waiting for processes to exit. 
Dec 16 12:55:44.468000 audit[4984]: CRED_DISP pid=4984 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:44.478690 kernel: audit: type=1106 audit(1765889744.467:772): pid=4984 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:44.478730 kernel: audit: type=1104 audit(1765889744.468:773): pid=4984 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:44.479346 systemd-logind[1616]: Removed session 10. Dec 16 12:55:44.470000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-77.42.41.174:22-147.75.109.163:52266 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:44.675398 systemd[1]: Started sshd@10-77.42.41.174:22-147.75.109.163:52278.service - OpenSSH per-connection server daemon (147.75.109.163:52278). Dec 16 12:55:44.674000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-77.42.41.174:22-147.75.109.163:52278 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:55:45.315661 kubelet[2807]: E1216 12:55:45.315620 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:55:46.040000 audit[5000]: USER_ACCT pid=5000 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:46.041792 sshd[5000]: Accepted publickey for core from 147.75.109.163 port 52278 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:55:46.041000 audit[5000]: CRED_ACQ pid=5000 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:46.041000 audit[5000]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd36c7b920 a2=3 a3=0 items=0 ppid=1 pid=5000 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:55:46.041000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:55:46.043100 sshd-session[5000]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:55:46.048936 systemd-logind[1616]: New session 11 of user core. 
Dec 16 12:55:46.054404 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 16 12:55:46.056000 audit[5000]: USER_START pid=5000 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:46.058000 audit[5003]: CRED_ACQ pid=5003 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:46.718279 sshd[5003]: Connection closed by 147.75.109.163 port 52278 Dec 16 12:55:46.722240 sshd-session[5000]: pam_unix(sshd:session): session closed for user core Dec 16 12:55:46.722000 audit[5000]: USER_END pid=5000 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:46.723000 audit[5000]: CRED_DISP pid=5000 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:46.725000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-77.42.41.174:22-147.75.109.163:52278 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:46.726591 systemd[1]: sshd@10-77.42.41.174:22-147.75.109.163:52278.service: Deactivated successfully. 
Dec 16 12:55:46.728750 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 12:55:46.730194 systemd-logind[1616]: Session 11 logged out. Waiting for processes to exit. Dec 16 12:55:46.734379 systemd-logind[1616]: Removed session 11. Dec 16 12:55:46.907743 systemd[1]: Started sshd@11-77.42.41.174:22-147.75.109.163:52294.service - OpenSSH per-connection server daemon (147.75.109.163:52294). Dec 16 12:55:46.906000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-77.42.41.174:22-147.75.109.163:52294 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:48.708467 kernel: kauditd_printk_skb: 13 callbacks suppressed Dec 16 12:55:48.708582 kernel: audit: type=1101 audit(1765889748.699:785): pid=5013 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:48.699000 audit[5013]: USER_ACCT pid=5013 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:48.701734 sshd-session[5013]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:55:48.708859 sshd[5013]: Accepted publickey for core from 147.75.109.163 port 52294 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:55:48.699000 audit[5013]: CRED_ACQ pid=5013 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 
12:55:48.716177 kernel: audit: type=1103 audit(1765889748.699:786): pid=5013 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:48.715911 systemd-logind[1616]: New session 12 of user core. Dec 16 12:55:48.728033 kernel: audit: type=1006 audit(1765889748.699:787): pid=5013 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Dec 16 12:55:48.728083 kernel: audit: type=1300 audit(1765889748.699:787): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffee0576ee0 a2=3 a3=0 items=0 ppid=1 pid=5013 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:55:48.699000 audit[5013]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffee0576ee0 a2=3 a3=0 items=0 ppid=1 pid=5013 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:55:48.721562 systemd[1]: Started session-12.scope - Session 12 of User core. 
Dec 16 12:55:48.733967 kernel: audit: type=1327 audit(1765889748.699:787): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:55:48.733992 kernel: audit: type=1105 audit(1765889748.731:788): pid=5013 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:48.699000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:55:48.731000 audit[5013]: USER_START pid=5013 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:48.732000 audit[5020]: CRED_ACQ pid=5020 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:48.743625 kernel: audit: type=1103 audit(1765889748.732:789): pid=5020 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:49.442330 sshd[5020]: Connection closed by 147.75.109.163 port 52294 Dec 16 12:55:49.442880 sshd-session[5013]: pam_unix(sshd:session): session closed for user core Dec 16 12:55:49.445000 audit[5013]: USER_END pid=5013 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail 
acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:49.456297 kernel: audit: type=1106 audit(1765889749.445:790): pid=5013 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:49.450613 systemd-logind[1616]: Session 12 logged out. Waiting for processes to exit. Dec 16 12:55:49.453582 systemd[1]: sshd@11-77.42.41.174:22-147.75.109.163:52294.service: Deactivated successfully. Dec 16 12:55:49.456611 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 12:55:49.445000 audit[5013]: CRED_DISP pid=5013 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:49.459381 systemd-logind[1616]: Removed session 12. Dec 16 12:55:49.467025 kernel: audit: type=1104 audit(1765889749.445:791): pid=5013 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:49.476245 kernel: audit: type=1131 audit(1765889749.447:792): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-77.42.41.174:22-147.75.109.163:52294 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:49.447000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-77.42.41.174:22-147.75.109.163:52294 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Dec 16 12:55:50.314517 kubelet[2807]: E1216 12:55:50.314030 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:55:51.315090 containerd[1649]: time="2025-12-16T12:55:51.315043822Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:55:51.761174 containerd[1649]: time="2025-12-16T12:55:51.760845911Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:55:51.762087 containerd[1649]: time="2025-12-16T12:55:51.762061011Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:55:51.762475 containerd[1649]: time="2025-12-16T12:55:51.762280392Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:55:51.763006 kubelet[2807]: E1216 12:55:51.762724 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:55:51.763255 kubelet[2807]: E1216 12:55:51.763012 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:55:51.763255 kubelet[2807]: E1216 12:55:51.763115 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:6195276ecb614f0fa525b9a7c33c407b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nv486,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6746c4dc5c-qzpt8_calico-system(662f30c6-4ed6-44dc-96b4-74080eea2751): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed 
to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:55:51.765540 containerd[1649]: time="2025-12-16T12:55:51.765496087Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:55:52.206652 containerd[1649]: time="2025-12-16T12:55:52.206073947Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:55:52.207260 containerd[1649]: time="2025-12-16T12:55:52.207222072Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:55:52.208332 containerd[1649]: time="2025-12-16T12:55:52.207306480Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:55:52.208600 kubelet[2807]: E1216 12:55:52.208518 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:55:52.208600 kubelet[2807]: E1216 12:55:52.208586 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:55:52.208876 kubelet[2807]: E1216 12:55:52.208737 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nv486,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6746c4dc5c-qzpt8_calico-system(662f30c6-4ed6-44dc-96b4-74080eea2751): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:55:52.209916 kubelet[2807]: E1216 12:55:52.209855 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:55:54.536000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-77.42.41.174:22-147.75.109.163:57752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:55:54.537857 systemd[1]: Started sshd@12-77.42.41.174:22-147.75.109.163:57752.service - OpenSSH per-connection server daemon (147.75.109.163:57752). Dec 16 12:55:54.546226 kernel: audit: type=1130 audit(1765889754.536:793): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-77.42.41.174:22-147.75.109.163:57752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:55:55.314691 kubelet[2807]: E1216 12:55:55.314338 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:55:55.316885 kubelet[2807]: E1216 12:55:55.316189 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:55:55.484000 audit[5040]: USER_ACCT pid=5040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:55.487625 sshd-session[5040]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 
16 12:55:55.488217 sshd[5040]: Accepted publickey for core from 147.75.109.163 port 57752 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:55:55.493203 kernel: audit: type=1101 audit(1765889755.484:794): pid=5040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:55.494267 systemd-logind[1616]: New session 13 of user core. Dec 16 12:55:55.484000 audit[5040]: CRED_ACQ pid=5040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:55.507510 kernel: audit: type=1103 audit(1765889755.484:795): pid=5040 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:55.507571 kernel: audit: type=1006 audit(1765889755.484:796): pid=5040 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Dec 16 12:55:55.513241 kernel: audit: type=1300 audit(1765889755.484:796): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffcc3df1e0 a2=3 a3=0 items=0 ppid=1 pid=5040 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:55:55.484000 audit[5040]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffcc3df1e0 a2=3 a3=0 items=0 ppid=1 pid=5040 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:55:55.505334 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 16 12:55:55.518204 kernel: audit: type=1327 audit(1765889755.484:796): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:55:55.484000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:55:55.514000 audit[5040]: USER_START pid=5040 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:55.526180 kernel: audit: type=1105 audit(1765889755.514:797): pid=5040 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:55.517000 audit[5043]: CRED_ACQ pid=5043 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:55.533184 kernel: audit: type=1103 audit(1765889755.517:798): pid=5043 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:56.093486 sshd[5043]: Connection closed by 147.75.109.163 port 57752 Dec 16 12:55:56.094310 sshd-session[5040]: pam_unix(sshd:session): session closed for user core Dec 16 12:55:56.094000 audit[5040]: USER_END 
pid=5040 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:56.106232 kernel: audit: type=1106 audit(1765889756.094:799): pid=5040 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:56.098199 systemd-logind[1616]: Session 13 logged out. Waiting for processes to exit. Dec 16 12:55:56.094000 audit[5040]: CRED_DISP pid=5040 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:56.099737 systemd[1]: sshd@12-77.42.41.174:22-147.75.109.163:57752.service: Deactivated successfully. Dec 16 12:55:56.102238 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 12:55:56.105376 systemd-logind[1616]: Removed session 13. Dec 16 12:55:56.098000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-77.42.41.174:22-147.75.109.163:57752 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:55:56.115232 kernel: audit: type=1104 audit(1765889756.094:800): pid=5040 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:55:57.315763 containerd[1649]: time="2025-12-16T12:55:57.315091421Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:55:57.749457 containerd[1649]: time="2025-12-16T12:55:57.749415873Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:55:57.750601 containerd[1649]: time="2025-12-16T12:55:57.750537290Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:55:57.750755 containerd[1649]: time="2025-12-16T12:55:57.750617700Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:55:57.750858 kubelet[2807]: E1216 12:55:57.750791 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:55:57.751300 kubelet[2807]: E1216 12:55:57.750868 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:55:57.751300 kubelet[2807]: E1216 12:55:57.751044 2807 kuberuntime_manager.go:1341] "Unhandled 
Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4bft4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f5b8bcc75-trldf_calico-apiserver(4afbd6d6-aac3-4d68-be88-76917639058c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:55:57.752510 kubelet[2807]: E1216 12:55:57.752471 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:56:00.314822 containerd[1649]: time="2025-12-16T12:56:00.314780569Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:56:00.751382 containerd[1649]: time="2025-12-16T12:56:00.751283348Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:56:00.754135 containerd[1649]: time="2025-12-16T12:56:00.754030372Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:56:00.754394 containerd[1649]: time="2025-12-16T12:56:00.754215047Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:56:00.754652 kubelet[2807]: E1216 12:56:00.754562 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:56:00.754652 kubelet[2807]: E1216 12:56:00.754634 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:56:00.756208 kubelet[2807]: E1216 12:56:00.754803 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7gp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9bvxr_calico-system(83624a12-e59b-4753-81b9-815a3846bf01): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:56:00.756208 kubelet[2807]: E1216 12:56:00.756108 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:56:01.281598 systemd[1]: Started sshd@13-77.42.41.174:22-147.75.109.163:57760.service - OpenSSH per-connection server daemon (147.75.109.163:57760). 
Dec 16 12:56:01.287519 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:56:01.291281 kernel: audit: type=1130 audit(1765889761.280:802): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.41.174:22-147.75.109.163:57760 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:01.280000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.41.174:22-147.75.109.163:57760 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:02.246000 audit[5058]: USER_ACCT pid=5058 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.249218 sshd-session[5058]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:56:02.251054 sshd[5058]: Accepted publickey for core from 147.75.109.163 port 57760 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:56:02.258221 kernel: audit: type=1101 audit(1765889762.246:803): pid=5058 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.258279 kernel: audit: type=1103 audit(1765889762.247:804): pid=5058 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.247000 audit[5058]: CRED_ACQ pid=5058 uid=0 auid=4294967295 
ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.260860 systemd-logind[1616]: New session 14 of user core. Dec 16 12:56:02.269221 kernel: audit: type=1006 audit(1765889762.247:805): pid=5058 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Dec 16 12:56:02.247000 audit[5058]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeed98dd40 a2=3 a3=0 items=0 ppid=1 pid=5058 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:02.270640 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 16 12:56:02.281035 kernel: audit: type=1300 audit(1765889762.247:805): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeed98dd40 a2=3 a3=0 items=0 ppid=1 pid=5058 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:02.281086 kernel: audit: type=1327 audit(1765889762.247:805): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:56:02.247000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:56:02.291437 kernel: audit: type=1105 audit(1765889762.279:806): pid=5058 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.279000 audit[5058]: USER_START pid=5058 uid=0 auid=500 ses=14 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.290000 audit[5061]: CRED_ACQ pid=5061 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.297798 kernel: audit: type=1103 audit(1765889762.290:807): pid=5061 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.967180 sshd[5061]: Connection closed by 147.75.109.163 port 57760 Dec 16 12:56:02.968284 sshd-session[5058]: pam_unix(sshd:session): session closed for user core Dec 16 12:56:02.968000 audit[5058]: USER_END pid=5058 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.971997 systemd[1]: sshd@13-77.42.41.174:22-147.75.109.163:57760.service: Deactivated successfully. Dec 16 12:56:02.973972 systemd[1]: session-14.scope: Deactivated successfully. Dec 16 12:56:02.975249 systemd-logind[1616]: Session 14 logged out. Waiting for processes to exit. Dec 16 12:56:02.976242 systemd-logind[1616]: Removed session 14. 
Dec 16 12:56:02.983654 kernel: audit: type=1106 audit(1765889762.968:808): pid=5058 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.983705 kernel: audit: type=1104 audit(1765889762.968:809): pid=5058 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.968000 audit[5058]: CRED_DISP pid=5058 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:02.971000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.41.174:22-147.75.109.163:57760 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:03.154000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-77.42.41.174:22-147.75.109.163:52816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:03.155369 systemd[1]: Started sshd@14-77.42.41.174:22-147.75.109.163:52816.service - OpenSSH per-connection server daemon (147.75.109.163:52816). 
Dec 16 12:56:03.351190 kubelet[2807]: E1216 12:56:03.350733 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:56:04.079000 audit[5098]: USER_ACCT pid=5098 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:04.081069 sshd[5098]: Accepted publickey for core from 147.75.109.163 port 52816 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:56:04.080000 audit[5098]: CRED_ACQ pid=5098 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:04.080000 audit[5098]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeffb4dc10 a2=3 a3=0 items=0 ppid=1 pid=5098 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:04.080000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:56:04.082190 sshd-session[5098]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:56:04.087209 systemd-logind[1616]: New session 15 of user core. Dec 16 12:56:04.092304 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 16 12:56:04.093000 audit[5098]: USER_START pid=5098 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:04.095000 audit[5101]: CRED_ACQ pid=5101 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:04.317141 containerd[1649]: time="2025-12-16T12:56:04.316754719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:56:04.785385 containerd[1649]: time="2025-12-16T12:56:04.785339537Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:56:04.787410 containerd[1649]: time="2025-12-16T12:56:04.787367940Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:56:04.787489 containerd[1649]: time="2025-12-16T12:56:04.787453610Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:56:04.787654 kubelet[2807]: E1216 
12:56:04.787600 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:56:04.787938 kubelet[2807]: E1216 12:56:04.787666 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:56:04.787938 kubelet[2807]: E1216 12:56:04.787774 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pk96r,ReadOnly:true,MountPath:/va
r/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7cc84b5-xz9fx_calico-system(0318a864-5985-4f05-83eb-6e5fed8acf7e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:56:04.789942 kubelet[2807]: E1216 12:56:04.789914 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:56:04.902455 sshd[5101]: Connection closed by 147.75.109.163 port 52816 Dec 16 12:56:04.912690 sshd-session[5098]: pam_unix(sshd:session): session closed for user core Dec 16 12:56:04.920000 audit[5098]: USER_END pid=5098 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:04.920000 audit[5098]: CRED_DISP pid=5098 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:04.924993 systemd-logind[1616]: Session 15 logged out. Waiting for processes to exit. Dec 16 12:56:04.925000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-77.42.41.174:22-147.75.109.163:52816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:04.926558 systemd[1]: sshd@14-77.42.41.174:22-147.75.109.163:52816.service: Deactivated successfully. Dec 16 12:56:04.930586 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 12:56:04.933436 systemd-logind[1616]: Removed session 15. Dec 16 12:56:05.054000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-77.42.41.174:22-147.75.109.163:52824 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:56:05.055288 systemd[1]: Started sshd@15-77.42.41.174:22-147.75.109.163:52824.service - OpenSSH per-connection server daemon (147.75.109.163:52824). Dec 16 12:56:05.909000 audit[5113]: USER_ACCT pid=5113 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:05.911081 sshd[5113]: Accepted publickey for core from 147.75.109.163 port 52824 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:56:05.911000 audit[5113]: CRED_ACQ pid=5113 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:05.911000 audit[5113]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff524101c0 a2=3 a3=0 items=0 ppid=1 pid=5113 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:05.911000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:56:05.913713 sshd-session[5113]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:56:05.920321 systemd-logind[1616]: New session 16 of user core. Dec 16 12:56:05.926314 systemd[1]: Started session-16.scope - Session 16 of User core. 
Dec 16 12:56:05.929000 audit[5113]: USER_START pid=5113 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:05.931000 audit[5130]: CRED_ACQ pid=5130 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:06.999232 kernel: kauditd_printk_skb: 20 callbacks suppressed Dec 16 12:56:06.999370 kernel: audit: type=1325 audit(1765889766.993:826): table=filter:138 family=2 entries=26 op=nft_register_rule pid=5140 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:06.993000 audit[5140]: NETFILTER_CFG table=filter:138 family=2 entries=26 op=nft_register_rule pid=5140 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:06.993000 audit[5140]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7fff0979f170 a2=0 a3=7fff0979f15c items=0 ppid=2949 pid=5140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:07.014038 kernel: audit: type=1300 audit(1765889766.993:826): arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7fff0979f170 a2=0 a3=7fff0979f15c items=0 ppid=2949 pid=5140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:07.014083 kernel: audit: type=1327 audit(1765889766.993:826): 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:56:06.993000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:56:07.017000 audit[5140]: NETFILTER_CFG table=nat:139 family=2 entries=20 op=nft_register_rule pid=5140 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:07.028657 kernel: audit: type=1325 audit(1765889767.017:827): table=nat:139 family=2 entries=20 op=nft_register_rule pid=5140 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:07.017000 audit[5140]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fff0979f170 a2=0 a3=0 items=0 ppid=2949 pid=5140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:07.038203 kernel: audit: type=1300 audit(1765889767.017:827): arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fff0979f170 a2=0 a3=0 items=0 ppid=2949 pid=5140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:07.017000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:56:07.044197 kernel: audit: type=1327 audit(1765889767.017:827): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:56:07.064000 audit[5142]: NETFILTER_CFG table=filter:140 family=2 entries=38 op=nft_register_rule pid=5142 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:07.072221 kernel: audit: type=1325 audit(1765889767.064:828): table=filter:140 family=2 
entries=38 op=nft_register_rule pid=5142 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:07.064000 audit[5142]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffc6e263dc0 a2=0 a3=7ffc6e263dac items=0 ppid=2949 pid=5142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:07.064000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:56:07.085287 kernel: audit: type=1300 audit(1765889767.064:828): arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffc6e263dc0 a2=0 a3=7ffc6e263dac items=0 ppid=2949 pid=5142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:07.085380 kernel: audit: type=1327 audit(1765889767.064:828): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:56:07.071000 audit[5142]: NETFILTER_CFG table=nat:141 family=2 entries=20 op=nft_register_rule pid=5142 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:07.091050 kernel: audit: type=1325 audit(1765889767.071:829): table=nat:141 family=2 entries=20 op=nft_register_rule pid=5142 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:07.071000 audit[5142]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffc6e263dc0 a2=0 a3=0 items=0 ppid=2949 pid=5142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:07.071000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:56:07.149354 sshd[5130]: Connection closed by 147.75.109.163 port 52824 Dec 16 12:56:07.153107 sshd-session[5113]: pam_unix(sshd:session): session closed for user core Dec 16 12:56:07.157000 audit[5113]: USER_END pid=5113 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:07.157000 audit[5113]: CRED_DISP pid=5113 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:07.160846 systemd[1]: sshd@15-77.42.41.174:22-147.75.109.163:52824.service: Deactivated successfully. Dec 16 12:56:07.160000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-77.42.41.174:22-147.75.109.163:52824 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:07.165980 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 12:56:07.169510 systemd-logind[1616]: Session 16 logged out. Waiting for processes to exit. Dec 16 12:56:07.171144 systemd-logind[1616]: Removed session 16. Dec 16 12:56:07.319623 systemd[1]: Started sshd@16-77.42.41.174:22-147.75.109.163:52838.service - OpenSSH per-connection server daemon (147.75.109.163:52838). Dec 16 12:56:07.318000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-77.42.41.174:22-147.75.109.163:52838 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:56:08.183000 audit[5154]: USER_ACCT pid=5154 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:08.185630 sshd[5154]: Accepted publickey for core from 147.75.109.163 port 52838 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:56:08.185000 audit[5154]: CRED_ACQ pid=5154 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:08.185000 audit[5154]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff42884a70 a2=3 a3=0 items=0 ppid=1 pid=5154 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:08.185000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:56:08.188120 sshd-session[5154]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:56:08.195330 systemd-logind[1616]: New session 17 of user core. Dec 16 12:56:08.200428 systemd[1]: Started session-17.scope - Session 17 of User core. 
Dec 16 12:56:08.203000 audit[5154]: USER_START pid=5154 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:08.205000 audit[5157]: CRED_ACQ pid=5157 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:08.941480 sshd[5157]: Connection closed by 147.75.109.163 port 52838 Dec 16 12:56:08.942564 sshd-session[5154]: pam_unix(sshd:session): session closed for user core Dec 16 12:56:08.942000 audit[5154]: USER_END pid=5154 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:08.943000 audit[5154]: CRED_DISP pid=5154 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:08.946535 systemd-logind[1616]: Session 17 logged out. Waiting for processes to exit. Dec 16 12:56:08.946769 systemd[1]: sshd@16-77.42.41.174:22-147.75.109.163:52838.service: Deactivated successfully. Dec 16 12:56:08.945000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-77.42.41.174:22-147.75.109.163:52838 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:56:08.948732 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 12:56:08.950581 systemd-logind[1616]: Removed session 17. Dec 16 12:56:09.123000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-77.42.41.174:22-147.75.109.163:52852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:09.124511 systemd[1]: Started sshd@17-77.42.41.174:22-147.75.109.163:52852.service - OpenSSH per-connection server daemon (147.75.109.163:52852). Dec 16 12:56:09.982000 audit[5167]: USER_ACCT pid=5167 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:09.984502 sshd[5167]: Accepted publickey for core from 147.75.109.163 port 52852 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:56:09.989439 sshd-session[5167]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:56:09.987000 audit[5167]: CRED_ACQ pid=5167 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:09.987000 audit[5167]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffff128dff0 a2=3 a3=0 items=0 ppid=1 pid=5167 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:09.987000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:56:10.000475 systemd-logind[1616]: New session 18 of user core. 
Dec 16 12:56:10.008042 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 16 12:56:10.021000 audit[5167]: USER_START pid=5167 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:10.025000 audit[5170]: CRED_ACQ pid=5170 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:10.320006 containerd[1649]: time="2025-12-16T12:56:10.319267210Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:56:10.583273 sshd[5170]: Connection closed by 147.75.109.163 port 52852 Dec 16 12:56:10.584375 sshd-session[5167]: pam_unix(sshd:session): session closed for user core Dec 16 12:56:10.585000 audit[5167]: USER_END pid=5167 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:10.586000 audit[5167]: CRED_DISP pid=5167 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:10.590698 systemd[1]: sshd@17-77.42.41.174:22-147.75.109.163:52852.service: Deactivated successfully. 
Dec 16 12:56:10.591000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-77.42.41.174:22-147.75.109.163:52852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:10.595571 systemd[1]: session-18.scope: Deactivated successfully. Dec 16 12:56:10.598440 systemd-logind[1616]: Session 18 logged out. Waiting for processes to exit. Dec 16 12:56:10.599767 systemd-logind[1616]: Removed session 18. Dec 16 12:56:10.767256 containerd[1649]: time="2025-12-16T12:56:10.767214814Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:56:10.769242 containerd[1649]: time="2025-12-16T12:56:10.769128442Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:56:10.769242 containerd[1649]: time="2025-12-16T12:56:10.769218982Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:56:10.770390 kubelet[2807]: E1216 12:56:10.770310 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:56:10.770390 kubelet[2807]: E1216 12:56:10.770375 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:56:10.771264 kubelet[2807]: E1216 12:56:10.770995 2807 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn857,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f5b8bcc75-wg5dd_calico-apiserver(9345e167-5638-4038-a959-3d55222d2d5c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:56:10.771605 containerd[1649]: time="2025-12-16T12:56:10.771552525Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:56:10.772226 kubelet[2807]: E1216 12:56:10.772179 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:56:11.184175 containerd[1649]: time="2025-12-16T12:56:11.184125823Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:56:11.185449 containerd[1649]: time="2025-12-16T12:56:11.185373506Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:56:11.185449 containerd[1649]: time="2025-12-16T12:56:11.185405616Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:56:11.186000 kubelet[2807]: E1216 12:56:11.185624 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:56:11.186000 kubelet[2807]: E1216 12:56:11.185670 2807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:56:11.186000 kubelet[2807]: E1216 12:56:11.185772 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn6d5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-xdkpf_calico-system(b5e01eba-2e7b-44aa-9650-696a129f0a90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:56:11.189082 containerd[1649]: time="2025-12-16T12:56:11.188890163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:56:11.314978 kubelet[2807]: E1216 12:56:11.314705 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:56:11.622121 containerd[1649]: time="2025-12-16T12:56:11.621940441Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:56:11.624194 containerd[1649]: time="2025-12-16T12:56:11.624005583Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:56:11.624194 containerd[1649]: time="2025-12-16T12:56:11.624122340Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:56:11.624504 kubelet[2807]: E1216 12:56:11.624434 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:56:11.624590 kubelet[2807]: E1216 12:56:11.624542 2807 kuberuntime_image.go:55] 
"Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:56:11.630256 kubelet[2807]: E1216 12:56:11.630174 2807 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn6d5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&
SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-xdkpf_calico-system(b5e01eba-2e7b-44aa-9650-696a129f0a90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:56:11.631721 kubelet[2807]: E1216 12:56:11.631675 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:56:12.198199 kernel: kauditd_printk_skb: 27 callbacks suppressed Dec 16 12:56:12.198307 kernel: audit: type=1325 audit(1765889772.190:851): table=filter:142 family=2 entries=26 op=nft_register_rule pid=5182 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:12.205804 kernel: audit: type=1300 audit(1765889772.190:851): arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd32a05510 a2=0 a3=7ffd32a054fc items=0 ppid=2949 pid=5182 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:12.190000 audit[5182]: NETFILTER_CFG table=filter:142 family=2 entries=26 op=nft_register_rule pid=5182 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:12.190000 audit[5182]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd32a05510 a2=0 a3=7ffd32a054fc items=0 ppid=2949 pid=5182 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:12.190000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:56:12.219266 kernel: audit: type=1327 audit(1765889772.190:851): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:56:12.209000 audit[5182]: NETFILTER_CFG table=nat:143 family=2 entries=104 op=nft_register_chain pid=5182 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:12.209000 audit[5182]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffd32a05510 a2=0 a3=7ffd32a054fc items=0 ppid=2949 pid=5182 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:12.233047 kernel: audit: type=1325 audit(1765889772.209:852): table=nat:143 family=2 entries=104 op=nft_register_chain pid=5182 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:56:12.233120 kernel: audit: type=1300 audit(1765889772.209:852): arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffd32a05510 a2=0 a3=7ffd32a054fc items=0 ppid=2949 pid=5182 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:12.209000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:56:12.244352 kernel: audit: type=1327 audit(1765889772.209:852): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:56:15.315685 kubelet[2807]: E1216 12:56:15.315633 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:56:16.316741 kubelet[2807]: E1216 12:56:16.316265 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:56:16.318549 kubelet[2807]: E1216 12:56:16.318486 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:56:16.829000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.41.174:22-147.75.109.163:45062 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:16.830484 systemd[1]: Started sshd@18-77.42.41.174:22-147.75.109.163:45062.service - OpenSSH per-connection server daemon (147.75.109.163:45062). Dec 16 12:56:16.837284 kernel: audit: type=1130 audit(1765889776.829:853): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.41.174:22-147.75.109.163:45062 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:56:18.821191 kernel: audit: type=1101 audit(1765889778.811:854): pid=5184 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:18.811000 audit[5184]: USER_ACCT pid=5184 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:18.822391 sshd[5184]: Accepted publickey for core from 147.75.109.163 port 45062 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:56:18.823422 sshd-session[5184]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:56:18.830509 systemd-logind[1616]: New session 19 of user core. Dec 16 12:56:18.840563 kernel: audit: type=1103 audit(1765889778.821:855): pid=5184 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:18.821000 audit[5184]: CRED_ACQ pid=5184 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:18.846621 systemd[1]: Started session-19.scope - Session 19 of User core. 
Dec 16 12:56:18.847460 kernel: audit: type=1006 audit(1765889778.821:856): pid=5184 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Dec 16 12:56:18.861368 kernel: audit: type=1300 audit(1765889778.821:856): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc99f2ffd0 a2=3 a3=0 items=0 ppid=1 pid=5184 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:18.861451 kernel: audit: type=1327 audit(1765889778.821:856): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:56:18.821000 audit[5184]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc99f2ffd0 a2=3 a3=0 items=0 ppid=1 pid=5184 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:18.821000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:56:18.850000 audit[5184]: USER_START pid=5184 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:18.867740 kernel: audit: type=1105 audit(1765889778.850:857): pid=5184 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:18.860000 audit[5187]: CRED_ACQ pid=5187 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:18.875772 kernel: audit: type=1103 audit(1765889778.860:858): pid=5187 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:19.389083 sshd[5187]: Connection closed by 147.75.109.163 port 45062 Dec 16 12:56:19.391459 sshd-session[5184]: pam_unix(sshd:session): session closed for user core Dec 16 12:56:19.392000 audit[5184]: USER_END pid=5184 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:19.398460 systemd[1]: sshd@18-77.42.41.174:22-147.75.109.163:45062.service: Deactivated successfully. Dec 16 12:56:19.401285 systemd[1]: session-19.scope: Deactivated successfully. 
Dec 16 12:56:19.393000 audit[5184]: CRED_DISP pid=5184 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:19.404281 kernel: audit: type=1106 audit(1765889779.392:859): pid=5184 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:19.404337 kernel: audit: type=1104 audit(1765889779.393:860): pid=5184 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:19.408492 systemd-logind[1616]: Session 19 logged out. Waiting for processes to exit. Dec 16 12:56:19.397000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.41.174:22-147.75.109.163:45062 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:19.411173 kernel: audit: type=1131 audit(1765889779.397:861): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.41.174:22-147.75.109.163:45062 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:19.413634 systemd-logind[1616]: Removed session 19. 
Dec 16 12:56:22.319174 kubelet[2807]: E1216 12:56:22.319110 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:56:22.325746 kubelet[2807]: E1216 12:56:22.325712 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:56:23.315346 kubelet[2807]: E1216 12:56:23.315108 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:56:24.571000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.41.174:22-147.75.109.163:33708 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:24.572234 systemd[1]: Started sshd@19-77.42.41.174:22-147.75.109.163:33708.service - OpenSSH per-connection server daemon (147.75.109.163:33708). Dec 16 12:56:24.580288 kernel: audit: type=1130 audit(1765889784.571:862): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.41.174:22-147.75.109.163:33708 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:25.466000 audit[5198]: USER_ACCT pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:25.470510 sshd-session[5198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:56:25.475135 sshd[5198]: Accepted publickey for core from 147.75.109.163 port 33708 ssh2: RSA SHA256:+mYykPsH18noHTsRis8NJASgu+tKUV30q0RfKy5UyhA Dec 16 12:56:25.475275 kernel: audit: type=1101 audit(1765889785.466:863): pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:25.467000 audit[5198]: CRED_ACQ pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:25.488137 kernel: audit: type=1103 audit(1765889785.467:864): pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:25.488488 kernel: audit: type=1006 audit(1765889785.469:865): pid=5198 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Dec 16 12:56:25.469000 audit[5198]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdf886ede0 a2=3 a3=0 items=0 ppid=1 pid=5198 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:25.498758 kernel: audit: type=1300 audit(1765889785.469:865): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdf886ede0 a2=3 a3=0 items=0 ppid=1 pid=5198 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:25.498803 kernel: audit: type=1327 audit(1765889785.469:865): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:56:25.469000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:56:25.500562 systemd-logind[1616]: New session 20 of user core. Dec 16 12:56:25.506538 systemd[1]: Started session-20.scope - Session 20 of User core. 
Dec 16 12:56:25.510000 audit[5198]: USER_START pid=5198 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:25.525874 kernel: audit: type=1105 audit(1765889785.510:866): pid=5198 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:25.525924 kernel: audit: type=1103 audit(1765889785.511:867): pid=5201 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:25.511000 audit[5201]: CRED_ACQ pid=5201 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:26.046301 sshd[5201]: Connection closed by 147.75.109.163 port 33708 Dec 16 12:56:26.046910 sshd-session[5198]: pam_unix(sshd:session): session closed for user core Dec 16 12:56:26.047000 audit[5198]: USER_END pid=5198 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:26.065914 kernel: audit: type=1106 audit(1765889786.047:868): pid=5198 uid=0 auid=500 ses=20 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:26.065991 kernel: audit: type=1104 audit(1765889786.050:869): pid=5198 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:26.050000 audit[5198]: CRED_DISP pid=5198 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:56:26.062400 systemd-logind[1616]: Session 20 logged out. Waiting for processes to exit. Dec 16 12:56:26.063990 systemd[1]: sshd@19-77.42.41.174:22-147.75.109.163:33708.service: Deactivated successfully. Dec 16 12:56:26.063000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.41.174:22-147.75.109.163:33708 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:56:26.072671 systemd[1]: session-20.scope: Deactivated successfully. Dec 16 12:56:26.081142 systemd-logind[1616]: Removed session 20. 
Dec 16 12:56:28.316332 kubelet[2807]: E1216 12:56:28.316253 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:56:28.318302 kubelet[2807]: E1216 12:56:28.317202 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:56:30.315541 kubelet[2807]: E1216 12:56:30.315095 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:56:34.317557 kubelet[2807]: E1216 12:56:34.317175 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:56:35.314739 kubelet[2807]: E1216 12:56:35.314681 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:56:36.318195 kubelet[2807]: E1216 12:56:36.316809 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: 
code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:56:42.323978 kubelet[2807]: E1216 12:56:42.323924 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:56:42.325705 kubelet[2807]: E1216 12:56:42.325658 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:56:43.314486 kubelet[2807]: E1216 12:56:43.314402 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:56:47.315697 kubelet[2807]: E1216 12:56:47.315346 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:56:48.314846 kubelet[2807]: E1216 12:56:48.314806 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:56:49.314109 kubelet[2807]: 
E1216 12:56:49.314057 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:56:53.314668 kubelet[2807]: E1216 12:56:53.314584 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e" Dec 16 12:56:53.314668 kubelet[2807]: E1216 12:56:53.314615 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:56:56.314841 kubelet[2807]: E1216 12:56:56.314779 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6746c4dc5c-qzpt8" podUID="662f30c6-4ed6-44dc-96b4-74080eea2751" Dec 16 12:56:58.080344 systemd[1]: cri-containerd-520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c.scope: Deactivated successfully. Dec 16 12:56:58.080655 systemd[1]: cri-containerd-520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c.scope: Consumed 2.976s CPU time, 87.8M memory peak, 67.3M read from disk. Dec 16 12:56:58.080000 audit: BPF prog-id=261 op=LOAD Dec 16 12:56:58.084512 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:56:58.084586 kernel: audit: type=1334 audit(1765889818.080:871): prog-id=261 op=LOAD Dec 16 12:56:58.080000 audit: BPF prog-id=85 op=UNLOAD Dec 16 12:56:58.088000 audit: BPF prog-id=108 op=UNLOAD Dec 16 12:56:58.094611 kernel: audit: type=1334 audit(1765889818.080:872): prog-id=85 op=UNLOAD Dec 16 12:56:58.094689 kernel: audit: type=1334 audit(1765889818.088:873): prog-id=108 op=UNLOAD Dec 16 12:56:58.088000 audit: BPF prog-id=112 op=UNLOAD Dec 16 12:56:58.096563 kernel: audit: type=1334 audit(1765889818.088:874): prog-id=112 op=UNLOAD Dec 16 12:56:58.110487 systemd[1]: cri-containerd-f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157.scope: Deactivated successfully. 
Dec 16 12:56:58.111007 systemd[1]: cri-containerd-f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157.scope: Consumed 24.773s CPU time, 131.5M memory peak, 41.6M read from disk. Dec 16 12:56:58.115000 audit: BPF prog-id=151 op=UNLOAD Dec 16 12:56:58.119884 kernel: audit: type=1334 audit(1765889818.115:875): prog-id=151 op=UNLOAD Dec 16 12:56:58.119946 kernel: audit: type=1334 audit(1765889818.115:876): prog-id=155 op=UNLOAD Dec 16 12:56:58.115000 audit: BPF prog-id=155 op=UNLOAD Dec 16 12:56:58.144743 containerd[1649]: time="2025-12-16T12:56:58.143776540Z" level=info msg="received container exit event container_id:\"520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c\" id:\"520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c\" pid:2663 exit_status:1 exited_at:{seconds:1765889818 nanos:98858297}" Dec 16 12:56:58.147142 containerd[1649]: time="2025-12-16T12:56:58.147023647Z" level=info msg="received container exit event container_id:\"f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157\" id:\"f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157\" pid:3201 exit_status:1 exited_at:{seconds:1765889818 nanos:110610685}" Dec 16 12:56:58.240665 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157-rootfs.mount: Deactivated successfully. Dec 16 12:56:58.243641 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c-rootfs.mount: Deactivated successfully. Dec 16 12:56:58.359502 systemd[1]: cri-containerd-ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c.scope: Deactivated successfully. Dec 16 12:56:58.360383 systemd[1]: cri-containerd-ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c.scope: Consumed 1.836s CPU time, 40.7M memory peak, 39.9M read from disk. 
Dec 16 12:56:58.364404 kernel: audit: type=1334 audit(1765889818.359:877): prog-id=262 op=LOAD Dec 16 12:56:58.359000 audit: BPF prog-id=262 op=LOAD Dec 16 12:56:58.366095 containerd[1649]: time="2025-12-16T12:56:58.365969175Z" level=info msg="received container exit event container_id:\"ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c\" id:\"ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c\" pid:2654 exit_status:1 exited_at:{seconds:1765889818 nanos:365510998}" Dec 16 12:56:58.359000 audit: BPF prog-id=90 op=UNLOAD Dec 16 12:56:58.369305 kernel: audit: type=1334 audit(1765889818.359:878): prog-id=90 op=UNLOAD Dec 16 12:56:58.368000 audit: BPF prog-id=103 op=UNLOAD Dec 16 12:56:58.375380 kernel: audit: type=1334 audit(1765889818.368:879): prog-id=103 op=UNLOAD Dec 16 12:56:58.375432 kernel: audit: type=1334 audit(1765889818.368:880): prog-id=107 op=UNLOAD Dec 16 12:56:58.368000 audit: BPF prog-id=107 op=UNLOAD Dec 16 12:56:58.399912 kubelet[2807]: E1216 12:56:58.396221 2807 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:54084->10.0.0.2:2379: read: connection timed out" Dec 16 12:56:58.399659 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c-rootfs.mount: Deactivated successfully. 
Dec 16 12:56:58.814682 kubelet[2807]: E1216 12:56:58.808541 2807 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:53906->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-6f5b8bcc75-trldf.1881b34dd38f8f77 calico-apiserver 1654 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-6f5b8bcc75-trldf,UID:4afbd6d6-aac3-4d68-be88-76917639058c,APIVersion:v1,ResourceVersion:823,FieldPath:spec.containers{calico-apiserver},},Reason:BackOff,Message:Back-off pulling image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4515-1-0-8-2e3d7ab7bb,},FirstTimestamp:2025-12-16 12:54:35 +0000 UTC,LastTimestamp:2025-12-16 12:56:48.314762076 +0000 UTC m=+176.100416100,Count:9,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-8-2e3d7ab7bb,}" Dec 16 12:56:59.172614 kubelet[2807]: I1216 12:56:59.172505 2807 scope.go:117] "RemoveContainer" containerID="520c4ee1436f7860974e1de5747de3dfa88ef630f0fa1b643c44e1c0b18d3f5c" Dec 16 12:56:59.172928 kubelet[2807]: I1216 12:56:59.172784 2807 scope.go:117] "RemoveContainer" containerID="dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615" Dec 16 12:56:59.174450 kubelet[2807]: I1216 12:56:59.174407 2807 scope.go:117] "RemoveContainer" containerID="f47dc90e5ef3ec1f2f5c4c3b088f68fc95e6bde6372c675dd9fb6edeb0ee7157" Dec 16 12:56:59.174576 kubelet[2807]: E1216 12:56:59.174539 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-9ldhb_tigera-operator(06a450cd-e4ff-47f9-8c2c-ce6bc5fdbb05)\"" 
pod="tigera-operator/tigera-operator-7dcd859c48-9ldhb" podUID="06a450cd-e4ff-47f9-8c2c-ce6bc5fdbb05" Dec 16 12:56:59.174576 kubelet[2807]: I1216 12:56:59.174571 2807 scope.go:117] "RemoveContainer" containerID="ae130a7a2aa90668945b00abf5777e5f412ad3f3a8040a2c4581af8947d6522c" Dec 16 12:56:59.175428 containerd[1649]: time="2025-12-16T12:56:59.175391229Z" level=info msg="CreateContainer within sandbox \"d885db62aa1ff963adabdcb48f9b5e0fca3aacd61aacd377293e25e07f2e340c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Dec 16 12:56:59.179398 containerd[1649]: time="2025-12-16T12:56:59.179274649Z" level=info msg="CreateContainer within sandbox \"94969a0ca0fa954824a585cf6ed6ae67112f1bfc39a6b743836847d7f32853d2\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Dec 16 12:56:59.211799 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3056555473.mount: Deactivated successfully. Dec 16 12:56:59.231388 containerd[1649]: time="2025-12-16T12:56:59.230695034Z" level=info msg="Container 8d5803de0cc776cd31d6c2b9a081caae779326c859bcff6ee60933c00cd5f800: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:56:59.236469 containerd[1649]: time="2025-12-16T12:56:59.236444156Z" level=info msg="Container a396895605f40327e62ee1bb6ae8b70b79539eb0d4e6129b1b6d6dd118033992: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:56:59.242547 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3179624712.mount: Deactivated successfully. 
Dec 16 12:56:59.257579 containerd[1649]: time="2025-12-16T12:56:59.256536598Z" level=info msg="CreateContainer within sandbox \"d885db62aa1ff963adabdcb48f9b5e0fca3aacd61aacd377293e25e07f2e340c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"a396895605f40327e62ee1bb6ae8b70b79539eb0d4e6129b1b6d6dd118033992\"" Dec 16 12:56:59.257579 containerd[1649]: time="2025-12-16T12:56:59.257265352Z" level=info msg="StartContainer for \"a396895605f40327e62ee1bb6ae8b70b79539eb0d4e6129b1b6d6dd118033992\"" Dec 16 12:56:59.258911 containerd[1649]: time="2025-12-16T12:56:59.258362687Z" level=info msg="connecting to shim a396895605f40327e62ee1bb6ae8b70b79539eb0d4e6129b1b6d6dd118033992" address="unix:///run/containerd/s/55dcba99f89cf5d347774b19885886473d7f8eb45100aaa5b275880aa538adfb" protocol=ttrpc version=3 Dec 16 12:56:59.260598 containerd[1649]: time="2025-12-16T12:56:59.260575660Z" level=info msg="RemoveContainer for \"dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615\"" Dec 16 12:56:59.289786 containerd[1649]: time="2025-12-16T12:56:59.289735394Z" level=info msg="CreateContainer within sandbox \"94969a0ca0fa954824a585cf6ed6ae67112f1bfc39a6b743836847d7f32853d2\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"8d5803de0cc776cd31d6c2b9a081caae779326c859bcff6ee60933c00cd5f800\"" Dec 16 12:56:59.290856 containerd[1649]: time="2025-12-16T12:56:59.290792683Z" level=info msg="StartContainer for \"8d5803de0cc776cd31d6c2b9a081caae779326c859bcff6ee60933c00cd5f800\"" Dec 16 12:56:59.291635 containerd[1649]: time="2025-12-16T12:56:59.291603520Z" level=info msg="connecting to shim 8d5803de0cc776cd31d6c2b9a081caae779326c859bcff6ee60933c00cd5f800" address="unix:///run/containerd/s/671293f849e4a705eb9c9e8665b0efe15278e9f9a456893d0ea9c7377a22d34d" protocol=ttrpc version=3 Dec 16 12:56:59.305468 systemd[1]: Started cri-containerd-a396895605f40327e62ee1bb6ae8b70b79539eb0d4e6129b1b6d6dd118033992.scope - libcontainer 
container a396895605f40327e62ee1bb6ae8b70b79539eb0d4e6129b1b6d6dd118033992. Dec 16 12:56:59.327455 systemd[1]: Started cri-containerd-8d5803de0cc776cd31d6c2b9a081caae779326c859bcff6ee60933c00cd5f800.scope - libcontainer container 8d5803de0cc776cd31d6c2b9a081caae779326c859bcff6ee60933c00cd5f800. Dec 16 12:56:59.346000 audit: BPF prog-id=263 op=LOAD Dec 16 12:56:59.348000 audit: BPF prog-id=264 op=LOAD Dec 16 12:56:59.348000 audit[5292]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2497 pid=5292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.348000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864353830336465306363373736636433316436633262396130383163 Dec 16 12:56:59.350000 audit: BPF prog-id=264 op=UNLOAD Dec 16 12:56:59.350000 audit[5292]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2497 pid=5292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864353830336465306363373736636433316436633262396130383163 Dec 16 12:56:59.352000 audit: BPF prog-id=265 op=LOAD Dec 16 12:56:59.352000 audit[5292]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2497 pid=5292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.352000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864353830336465306363373736636433316436633262396130383163 Dec 16 12:56:59.352000 audit: BPF prog-id=266 op=LOAD Dec 16 12:56:59.352000 audit[5292]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2497 pid=5292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.352000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864353830336465306363373736636433316436633262396130383163 Dec 16 12:56:59.352000 audit: BPF prog-id=266 op=UNLOAD Dec 16 12:56:59.352000 audit[5292]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2497 pid=5292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.352000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864353830336465306363373736636433316436633262396130383163 Dec 16 12:56:59.352000 audit: BPF prog-id=265 op=UNLOAD Dec 16 12:56:59.352000 audit[5292]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2497 pid=5292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.352000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864353830336465306363373736636433316436633262396130383163 Dec 16 12:56:59.352000 audit: BPF prog-id=267 op=LOAD Dec 16 12:56:59.352000 audit[5292]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2497 pid=5292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.352000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864353830336465306363373736636433316436633262396130383163 Dec 16 12:56:59.358910 containerd[1649]: time="2025-12-16T12:56:59.358873405Z" level=info msg="RemoveContainer for \"dcb0b7c69018305fece70074ee32d0c9a6741d5cb66995525fc0f1950aedc615\" returns successfully" Dec 16 12:56:59.358000 audit: BPF prog-id=268 op=LOAD Dec 16 12:56:59.360000 audit: BPF prog-id=269 op=LOAD Dec 16 12:56:59.360000 audit[5280]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=2489 pid=5280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.360000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133393638393536303566343033323765363265653162623661653862 Dec 16 12:56:59.361000 audit: BPF prog-id=269 op=UNLOAD Dec 16 12:56:59.361000 audit[5280]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=5280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133393638393536303566343033323765363265653162623661653862 Dec 16 12:56:59.361000 audit: BPF prog-id=270 op=LOAD Dec 16 12:56:59.361000 audit[5280]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2489 pid=5280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133393638393536303566343033323765363265653162623661653862 Dec 16 12:56:59.361000 audit: BPF prog-id=271 op=LOAD Dec 16 12:56:59.361000 audit[5280]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=2489 pid=5280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:56:59.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133393638393536303566343033323765363265653162623661653862 Dec 16 12:56:59.361000 audit: BPF prog-id=271 op=UNLOAD Dec 16 12:56:59.361000 audit[5280]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=5280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133393638393536303566343033323765363265653162623661653862 Dec 16 12:56:59.361000 audit: BPF prog-id=270 op=UNLOAD Dec 16 12:56:59.361000 audit[5280]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=5280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133393638393536303566343033323765363265653162623661653862 Dec 16 12:56:59.361000 audit: BPF prog-id=272 op=LOAD Dec 16 12:56:59.361000 audit[5280]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=2489 pid=5280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:56:59.361000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133393638393536303566343033323765363265653162623661653862 Dec 16 12:56:59.414776 containerd[1649]: time="2025-12-16T12:56:59.414745043Z" level=info msg="StartContainer for \"8d5803de0cc776cd31d6c2b9a081caae779326c859bcff6ee60933c00cd5f800\" returns successfully" Dec 16 12:56:59.415226 containerd[1649]: time="2025-12-16T12:56:59.414997506Z" level=info msg="StartContainer for \"a396895605f40327e62ee1bb6ae8b70b79539eb0d4e6129b1b6d6dd118033992\" returns successfully" Dec 16 12:57:01.314164 kubelet[2807]: E1216 12:57:01.314117 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-wg5dd" podUID="9345e167-5638-4038-a959-3d55222d2d5c" Dec 16 12:57:01.678619 kubelet[2807]: I1216 12:57:01.678493 2807 status_manager.go:890] "Failed to get status for pod" podUID="1f2e41c58f9938b008f633e63e0502f3" pod="kube-system/kube-controller-manager-ci-4515-1-0-8-2e3d7ab7bb" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:54022->10.0.0.2:2379: read: connection timed out" Dec 16 12:57:02.315285 kubelet[2807]: E1216 12:57:02.315070 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc 
error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-xdkpf" podUID="b5e01eba-2e7b-44aa-9650-696a129f0a90" Dec 16 12:57:03.314147 kubelet[2807]: E1216 12:57:03.314092 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f5b8bcc75-trldf" podUID="4afbd6d6-aac3-4d68-be88-76917639058c" Dec 16 12:57:04.315118 kubelet[2807]: E1216 12:57:04.315057 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9bvxr" podUID="83624a12-e59b-4753-81b9-815a3846bf01" Dec 16 12:57:04.315857 kubelet[2807]: E1216 12:57:04.315191 2807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7cc84b5-xz9fx" podUID="0318a864-5985-4f05-83eb-6e5fed8acf7e"