Dec 16 12:45:27.094062 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Tue Dec 16 00:18:19 -00 2025 Dec 16 12:45:27.094083 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=553464fdb0286a5b06b399da29ca659e521c68f08ea70a931c96ddffd00b4357 Dec 16 12:45:27.094093 kernel: BIOS-provided physical RAM map: Dec 16 12:45:27.094099 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Dec 16 12:45:27.094104 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Dec 16 12:45:27.094109 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Dec 16 12:45:27.094115 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable Dec 16 12:45:27.094121 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved Dec 16 12:45:27.094126 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Dec 16 12:45:27.094132 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved Dec 16 12:45:27.094138 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Dec 16 12:45:27.094143 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Dec 16 12:45:27.094148 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Dec 16 12:45:27.094153 kernel: NX (Execute Disable) protection: active Dec 16 12:45:27.094161 kernel: APIC: Static calls initialized Dec 16 12:45:27.094166 kernel: SMBIOS 3.0.0 present. 
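The kernel command line captured above (BOOT_IMAGE=/flatcar/vmlinuz-a, root=LABEL=ROOT, verity.usrhash=..., and so on) is exposed at runtime via /proc/cmdline. A minimal sketch for splitting it into bare flags and key=value parameters, assuming an ordinary procfs mount and ignoring shell-style quoting; a key that appears twice (as rootflags=rw does in the dracut line further down) simply keeps its last value here:

# Sketch: parse the kernel command line as it appears in this log.
# Assumes /proc/cmdline is readable; quoted values are not handled.
def parse_cmdline(path="/proc/cmdline"):
    with open(path) as f:
        tokens = f.read().split()
    flags, params = [], {}
    for tok in tokens:
        if "=" in tok:
            key, _, value = tok.partition("=")
            params[key] = value          # duplicates keep the last value
        else:
            flags.append(tok)
    return flags, params

if __name__ == "__main__":
    flags, params = parse_cmdline()
    print("root =", params.get("root"))
    print("usr verity hash =", params.get("verity.usrhash"))
    print("bare flags:", flags)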
Dec 16 12:45:27.094172 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017 Dec 16 12:45:27.094178 kernel: DMI: Memory slots populated: 1/1 Dec 16 12:45:27.094183 kernel: Hypervisor detected: KVM Dec 16 12:45:27.094189 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000 Dec 16 12:45:27.094194 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Dec 16 12:45:27.094200 kernel: kvm-clock: using sched offset of 4241317116 cycles Dec 16 12:45:27.094206 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Dec 16 12:45:27.094214 kernel: tsc: Detected 2445.404 MHz processor Dec 16 12:45:27.094220 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Dec 16 12:45:27.094227 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Dec 16 12:45:27.094233 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000 Dec 16 12:45:27.094239 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Dec 16 12:45:27.094245 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Dec 16 12:45:27.094251 kernel: Using GB pages for direct mapping Dec 16 12:45:27.094258 kernel: ACPI: Early table checksum verification disabled Dec 16 12:45:27.094264 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS ) Dec 16 12:45:27.094270 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:45:27.094276 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:45:27.094283 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:45:27.094289 kernel: ACPI: FACS 0x000000007CFE0000 000040 Dec 16 12:45:27.094295 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:45:27.094301 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:45:27.094308 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:45:27.094314 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:45:27.094322 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576] Dec 16 12:45:27.094329 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482] Dec 16 12:45:27.094335 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f] Dec 16 12:45:27.094343 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6] Dec 16 12:45:27.094349 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e] Dec 16 12:45:27.094355 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a] Dec 16 12:45:27.094361 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692] Dec 16 12:45:27.094367 kernel: No NUMA configuration found Dec 16 12:45:27.094374 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff] Dec 16 12:45:27.094381 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff] Dec 16 12:45:27.094387 kernel: Zone ranges: Dec 16 12:45:27.094394 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Dec 16 12:45:27.094400 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff] Dec 16 12:45:27.094406 kernel: Normal empty Dec 16 12:45:27.094412 kernel: Device empty Dec 16 12:45:27.094418 kernel: Movable zone start for each node Dec 16 12:45:27.094424 kernel: Early memory node ranges Dec 16 12:45:27.094432 kernel: node 0: [mem 
0x0000000000001000-0x000000000009efff] Dec 16 12:45:27.094438 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff] Dec 16 12:45:27.094444 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff] Dec 16 12:45:27.094450 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Dec 16 12:45:27.094457 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Dec 16 12:45:27.094463 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges Dec 16 12:45:27.094469 kernel: ACPI: PM-Timer IO Port: 0x608 Dec 16 12:45:27.094477 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Dec 16 12:45:27.094483 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Dec 16 12:45:27.094489 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Dec 16 12:45:27.094512 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Dec 16 12:45:27.094519 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Dec 16 12:45:27.098254 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Dec 16 12:45:27.098281 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Dec 16 12:45:27.098293 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Dec 16 12:45:27.098310 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Dec 16 12:45:27.098321 kernel: CPU topo: Max. logical packages: 1 Dec 16 12:45:27.098332 kernel: CPU topo: Max. logical dies: 1 Dec 16 12:45:27.098350 kernel: CPU topo: Max. dies per package: 1 Dec 16 12:45:27.098362 kernel: CPU topo: Max. threads per core: 1 Dec 16 12:45:27.098369 kernel: CPU topo: Num. cores per package: 2 Dec 16 12:45:27.098375 kernel: CPU topo: Num. threads per package: 2 Dec 16 12:45:27.098381 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Dec 16 12:45:27.098391 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Dec 16 12:45:27.098398 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices Dec 16 12:45:27.098404 kernel: Booting paravirtualized kernel on KVM Dec 16 12:45:27.098410 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Dec 16 12:45:27.098417 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Dec 16 12:45:27.098423 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Dec 16 12:45:27.098430 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Dec 16 12:45:27.098438 kernel: pcpu-alloc: [0] 0 1 Dec 16 12:45:27.098444 kernel: kvm-guest: PV spinlocks disabled, no host support Dec 16 12:45:27.098457 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=553464fdb0286a5b06b399da29ca659e521c68f08ea70a931c96ddffd00b4357 Dec 16 12:45:27.098472 kernel: random: crng init done Dec 16 12:45:27.098488 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 16 12:45:27.098514 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Dec 16 12:45:27.098541 kernel: Fallback order for Node 0: 0 Dec 16 12:45:27.098556 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 511866 Dec 16 12:45:27.098567 kernel: Policy zone: DMA32 Dec 16 12:45:27.098579 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Dec 16 12:45:27.098596 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Dec 16 12:45:27.098606 kernel: ftrace: allocating 40103 entries in 157 pages Dec 16 12:45:27.098613 kernel: ftrace: allocated 157 pages with 5 groups Dec 16 12:45:27.098619 kernel: Dynamic Preempt: voluntary Dec 16 12:45:27.098629 kernel: rcu: Preemptible hierarchical RCU implementation. Dec 16 12:45:27.098636 kernel: rcu: RCU event tracing is enabled. Dec 16 12:45:27.098642 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Dec 16 12:45:27.098648 kernel: Trampoline variant of Tasks RCU enabled. Dec 16 12:45:27.098655 kernel: Rude variant of Tasks RCU enabled. Dec 16 12:45:27.098661 kernel: Tracing variant of Tasks RCU enabled. Dec 16 12:45:27.098668 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Dec 16 12:45:27.098676 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Dec 16 12:45:27.098682 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Dec 16 12:45:27.098689 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Dec 16 12:45:27.098700 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Dec 16 12:45:27.098712 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Dec 16 12:45:27.098730 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Dec 16 12:45:27.098742 kernel: Console: colour VGA+ 80x25 Dec 16 12:45:27.098758 kernel: printk: legacy console [tty0] enabled Dec 16 12:45:27.098769 kernel: printk: legacy console [ttyS0] enabled Dec 16 12:45:27.098780 kernel: ACPI: Core revision 20240827 Dec 16 12:45:27.098804 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Dec 16 12:45:27.098820 kernel: APIC: Switch to symmetric I/O mode setup Dec 16 12:45:27.098832 kernel: x2apic enabled Dec 16 12:45:27.098845 kernel: APIC: Switched APIC routing to: physical x2apic Dec 16 12:45:27.098863 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Dec 16 12:45:27.098875 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns Dec 16 12:45:27.098892 kernel: Calibrating delay loop (skipped) preset value.. 
4890.80 BogoMIPS (lpj=2445404) Dec 16 12:45:27.098912 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Dec 16 12:45:27.098924 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Dec 16 12:45:27.098936 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Dec 16 12:45:27.098948 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Dec 16 12:45:27.098963 kernel: Spectre V2 : Mitigation: Retpolines Dec 16 12:45:27.098974 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Dec 16 12:45:27.098991 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls Dec 16 12:45:27.099004 kernel: active return thunk: retbleed_return_thunk Dec 16 12:45:27.099012 kernel: RETBleed: Mitigation: untrained return thunk Dec 16 12:45:27.099018 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Dec 16 12:45:27.099025 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Dec 16 12:45:27.099035 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Dec 16 12:45:27.099042 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Dec 16 12:45:27.099048 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Dec 16 12:45:27.099055 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Dec 16 12:45:27.099061 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Dec 16 12:45:27.099068 kernel: Freeing SMP alternatives memory: 32K Dec 16 12:45:27.099074 kernel: pid_max: default: 32768 minimum: 301 Dec 16 12:45:27.099082 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Dec 16 12:45:27.099089 kernel: landlock: Up and running. Dec 16 12:45:27.099095 kernel: SELinux: Initializing. Dec 16 12:45:27.099102 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Dec 16 12:45:27.099108 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Dec 16 12:45:27.099115 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0) Dec 16 12:45:27.099123 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Dec 16 12:45:27.099137 kernel: ... version: 0 Dec 16 12:45:27.099154 kernel: ... bit width: 48 Dec 16 12:45:27.099168 kernel: ... generic registers: 6 Dec 16 12:45:27.099181 kernel: ... value mask: 0000ffffffffffff Dec 16 12:45:27.099192 kernel: ... max period: 00007fffffffffff Dec 16 12:45:27.099204 kernel: ... fixed-purpose events: 0 Dec 16 12:45:27.099216 kernel: ... event mask: 000000000000003f Dec 16 12:45:27.099231 kernel: signal: max sigframe size: 1776 Dec 16 12:45:27.099242 kernel: rcu: Hierarchical SRCU implementation. Dec 16 12:45:27.099261 kernel: rcu: Max phase no-delay instances is 400. Dec 16 12:45:27.099270 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Dec 16 12:45:27.099277 kernel: smp: Bringing up secondary CPUs ... Dec 16 12:45:27.099284 kernel: smpboot: x86: Booting SMP configuration: Dec 16 12:45:27.099290 kernel: .... 
node #0, CPUs: #1 Dec 16 12:45:27.099300 kernel: smp: Brought up 1 node, 2 CPUs Dec 16 12:45:27.099306 kernel: smpboot: Total of 2 processors activated (9781.61 BogoMIPS) Dec 16 12:45:27.099314 kernel: Memory: 1934164K/2047464K available (14336K kernel code, 2444K rwdata, 31636K rodata, 15556K init, 2484K bss, 108756K reserved, 0K cma-reserved) Dec 16 12:45:27.099320 kernel: devtmpfs: initialized Dec 16 12:45:27.099327 kernel: x86/mm: Memory block size: 128MB Dec 16 12:45:27.099333 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Dec 16 12:45:27.099340 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Dec 16 12:45:27.099348 kernel: pinctrl core: initialized pinctrl subsystem Dec 16 12:45:27.099354 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Dec 16 12:45:27.099361 kernel: audit: initializing netlink subsys (disabled) Dec 16 12:45:27.099368 kernel: audit: type=2000 audit(1765889123.333:1): state=initialized audit_enabled=0 res=1 Dec 16 12:45:27.099374 kernel: thermal_sys: Registered thermal governor 'step_wise' Dec 16 12:45:27.099381 kernel: thermal_sys: Registered thermal governor 'user_space' Dec 16 12:45:27.099387 kernel: cpuidle: using governor menu Dec 16 12:45:27.099395 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Dec 16 12:45:27.099403 kernel: dca service started, version 1.12.1 Dec 16 12:45:27.099414 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] Dec 16 12:45:27.099428 kernel: PCI: Using configuration type 1 for base access Dec 16 12:45:27.099445 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Dec 16 12:45:27.099458 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Dec 16 12:45:27.099470 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Dec 16 12:45:27.099485 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Dec 16 12:45:27.099508 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Dec 16 12:45:27.099520 kernel: ACPI: Added _OSI(Module Device) Dec 16 12:45:27.099551 kernel: ACPI: Added _OSI(Processor Device) Dec 16 12:45:27.099565 kernel: ACPI: Added _OSI(Processor Aggregator Device) Dec 16 12:45:27.099573 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Dec 16 12:45:27.099579 kernel: ACPI: Interpreter enabled Dec 16 12:45:27.099586 kernel: ACPI: PM: (supports S0 S5) Dec 16 12:45:27.099596 kernel: ACPI: Using IOAPIC for interrupt routing Dec 16 12:45:27.099602 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Dec 16 12:45:27.099609 kernel: PCI: Using E820 reservations for host bridge windows Dec 16 12:45:27.099615 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Dec 16 12:45:27.099622 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Dec 16 12:45:27.099804 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Dec 16 12:45:27.099938 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Dec 16 12:45:27.100102 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Dec 16 12:45:27.100119 kernel: PCI host bridge to bus 0000:00 Dec 16 12:45:27.100250 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Dec 16 12:45:27.100384 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Dec 16 12:45:27.101000 kernel: pci_bus 0000:00: root bus 
resource [mem 0x000a0000-0x000bffff window] Dec 16 12:45:27.101163 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window] Dec 16 12:45:27.101269 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Dec 16 12:45:27.101420 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window] Dec 16 12:45:27.101566 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Dec 16 12:45:27.101720 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Dec 16 12:45:27.101906 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Dec 16 12:45:27.102041 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref] Dec 16 12:45:27.102180 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref] Dec 16 12:45:27.102337 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff] Dec 16 12:45:27.102474 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref] Dec 16 12:45:27.102653 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Dec 16 12:45:27.108178 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:45:27.108345 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff] Dec 16 12:45:27.108564 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Dec 16 12:45:27.108702 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff] Dec 16 12:45:27.108883 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref] Dec 16 12:45:27.109064 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:45:27.109165 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff] Dec 16 12:45:27.109312 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Dec 16 12:45:27.109408 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff] Dec 16 12:45:27.109667 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Dec 16 12:45:27.109837 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:45:27.109992 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff] Dec 16 12:45:27.110082 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Dec 16 12:45:27.110227 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff] Dec 16 12:45:27.110317 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Dec 16 12:45:27.110481 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:45:27.110612 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff] Dec 16 12:45:27.110782 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Dec 16 12:45:27.110879 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff] Dec 16 12:45:27.111028 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Dec 16 12:45:27.111126 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:45:27.111273 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff] Dec 16 12:45:27.111365 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Dec 16 12:45:27.112393 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff] Dec 16 12:45:27.112570 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Dec 16 12:45:27.112690 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:45:27.112851 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff] Dec 16 12:45:27.113594 
kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Dec 16 12:45:27.113762 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff] Dec 16 12:45:27.113861 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Dec 16 12:45:27.114017 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:45:27.114110 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff] Dec 16 12:45:27.114259 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Dec 16 12:45:27.114346 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff] Dec 16 12:45:27.114515 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Dec 16 12:45:27.115663 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:45:27.115831 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff] Dec 16 12:45:27.115923 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Dec 16 12:45:27.116024 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff] Dec 16 12:45:27.116650 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Dec 16 12:45:27.116868 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:45:27.117066 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff] Dec 16 12:45:27.117180 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Dec 16 12:45:27.117308 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff] Dec 16 12:45:27.117451 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Dec 16 12:45:27.118637 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Dec 16 12:45:27.118748 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Dec 16 12:45:27.118913 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Dec 16 12:45:27.119063 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f] Dec 16 12:45:27.119218 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff] Dec 16 12:45:27.119395 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Dec 16 12:45:27.120639 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] Dec 16 12:45:27.120863 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Dec 16 12:45:27.121037 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff] Dec 16 12:45:27.121201 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref] Dec 16 12:45:27.121355 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref] Dec 16 12:45:27.121571 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Dec 16 12:45:27.121770 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Dec 16 12:45:27.121933 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit] Dec 16 12:45:27.122106 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Dec 16 12:45:27.122283 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Dec 16 12:45:27.122448 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff] Dec 16 12:45:27.123126 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref] Dec 16 12:45:27.123220 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Dec 16 12:45:27.123379 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Dec 16 12:45:27.123474 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref] Dec 16 12:45:27.123654 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] 
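The root ports (1b36:000c), virtio endpoints (1af4:10xx) and the AHCI controller (8086:2922) enumerated above can be listed again from userspace by walking sysfs, without relying on lspci. A rough sketch, assuming /sys/bus/pci/devices is available:

# Sketch: list PCI devices as address, vendor:device and class code,
# mirroring the enumeration in the boot log. Assumes sysfs is mounted.
import os

PCI_ROOT = "/sys/bus/pci/devices"

def read(dev, attr):
    with open(os.path.join(PCI_ROOT, dev, attr)) as f:
        return f.read().strip()

for dev in sorted(os.listdir(PCI_ROOT)):
    vendor = read(dev, "vendor")   # e.g. 0x1af4 (virtio)
    device = read(dev, "device")   # e.g. 0x1041 (modern virtio-net)
    cls = read(dev, "class")       # e.g. 0x020000 (Ethernet controller)
    print(f"{dev} {vendor[2:]}:{device[2:]} class {cls[2:]}")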
Dec 16 12:45:27.123780 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Dec 16 12:45:27.123939 kernel: pci 0000:05:00.0: BAR 1 [mem 0xfe000000-0xfe000fff] Dec 16 12:45:27.124026 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref] Dec 16 12:45:27.124200 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Dec 16 12:45:27.124358 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Dec 16 12:45:27.124450 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff] Dec 16 12:45:27.126432 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref] Dec 16 12:45:27.126642 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Dec 16 12:45:27.126658 kernel: acpiphp: Slot [0] registered Dec 16 12:45:27.129622 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Dec 16 12:45:27.129850 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff] Dec 16 12:45:27.130032 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref] Dec 16 12:45:27.130282 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref] Dec 16 12:45:27.130436 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Dec 16 12:45:27.130458 kernel: acpiphp: Slot [0-2] registered Dec 16 12:45:27.130633 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Dec 16 12:45:27.130655 kernel: acpiphp: Slot [0-3] registered Dec 16 12:45:27.130825 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Dec 16 12:45:27.130854 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Dec 16 12:45:27.130866 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Dec 16 12:45:27.130878 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Dec 16 12:45:27.130890 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Dec 16 12:45:27.130902 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Dec 16 12:45:27.130914 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Dec 16 12:45:27.130926 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Dec 16 12:45:27.130937 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Dec 16 12:45:27.130953 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Dec 16 12:45:27.130966 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Dec 16 12:45:27.130978 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Dec 16 12:45:27.130989 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Dec 16 12:45:27.131001 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Dec 16 12:45:27.131012 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Dec 16 12:45:27.131024 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Dec 16 12:45:27.131038 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Dec 16 12:45:27.131049 kernel: iommu: Default domain type: Translated Dec 16 12:45:27.131066 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Dec 16 12:45:27.131080 kernel: PCI: Using ACPI for IRQ routing Dec 16 12:45:27.131094 kernel: PCI: pci_cache_line_size set to 64 bytes Dec 16 12:45:27.131111 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Dec 16 12:45:27.131125 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff] Dec 16 12:45:27.131284 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Dec 16 12:45:27.131467 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Dec 16 12:45:27.131672 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Dec 16 
12:45:27.131691 kernel: vgaarb: loaded Dec 16 12:45:27.131704 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Dec 16 12:45:27.131720 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Dec 16 12:45:27.131735 kernel: clocksource: Switched to clocksource kvm-clock Dec 16 12:45:27.131759 kernel: VFS: Disk quotas dquot_6.6.0 Dec 16 12:45:27.131773 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 16 12:45:27.131788 kernel: pnp: PnP ACPI init Dec 16 12:45:27.131958 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved Dec 16 12:45:27.131979 kernel: pnp: PnP ACPI: found 5 devices Dec 16 12:45:27.131993 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Dec 16 12:45:27.132008 kernel: NET: Registered PF_INET protocol family Dec 16 12:45:27.132021 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 16 12:45:27.132032 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Dec 16 12:45:27.132045 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 16 12:45:27.132060 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Dec 16 12:45:27.132076 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Dec 16 12:45:27.132085 kernel: TCP: Hash tables configured (established 16384 bind 16384) Dec 16 12:45:27.132099 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Dec 16 12:45:27.132113 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Dec 16 12:45:27.132130 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 16 12:45:27.132143 kernel: NET: Registered PF_XDP protocol family Dec 16 12:45:27.132293 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Dec 16 12:45:27.132455 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Dec 16 12:45:27.132733 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Dec 16 12:45:27.132902 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned Dec 16 12:45:27.133063 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned Dec 16 12:45:27.133228 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned Dec 16 12:45:27.133401 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Dec 16 12:45:27.133672 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff] Dec 16 12:45:27.133926 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref] Dec 16 12:45:27.134115 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Dec 16 12:45:27.134262 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff] Dec 16 12:45:27.134401 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Dec 16 12:45:27.134625 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Dec 16 12:45:27.134793 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff] Dec 16 12:45:27.134940 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Dec 16 12:45:27.135110 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Dec 16 12:45:27.135253 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff] Dec 16 12:45:27.135399 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Dec 16 12:45:27.135603 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Dec 16 12:45:27.135770 kernel: pci 0000:00:02.4: 
bridge window [mem 0xfe000000-0xfe1fffff] Dec 16 12:45:27.135920 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Dec 16 12:45:27.136088 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Dec 16 12:45:27.136250 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff] Dec 16 12:45:27.136410 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Dec 16 12:45:27.136614 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Dec 16 12:45:27.136763 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] Dec 16 12:45:27.136919 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff] Dec 16 12:45:27.137088 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Dec 16 12:45:27.137182 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Dec 16 12:45:27.137319 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] Dec 16 12:45:27.137416 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff] Dec 16 12:45:27.137611 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Dec 16 12:45:27.137710 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Dec 16 12:45:27.137863 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] Dec 16 12:45:27.138047 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff] Dec 16 12:45:27.138186 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Dec 16 12:45:27.138372 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Dec 16 12:45:27.138521 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Dec 16 12:45:27.138689 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Dec 16 12:45:27.138839 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window] Dec 16 12:45:27.139309 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Dec 16 12:45:27.139460 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window] Dec 16 12:45:27.139697 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff] Dec 16 12:45:27.139841 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref] Dec 16 12:45:27.139969 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff] Dec 16 12:45:27.140132 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] Dec 16 12:45:27.140259 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff] Dec 16 12:45:27.140412 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] Dec 16 12:45:27.140624 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff] Dec 16 12:45:27.140729 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] Dec 16 12:45:27.140894 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff] Dec 16 12:45:27.141043 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] Dec 16 12:45:27.141261 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff] Dec 16 12:45:27.141367 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] Dec 16 12:45:27.141561 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Dec 16 12:45:27.141650 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff] Dec 16 12:45:27.141785 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Dec 16 12:45:27.141912 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Dec 16 12:45:27.142071 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff] Dec 16 
12:45:27.142175 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Dec 16 12:45:27.142289 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Dec 16 12:45:27.142427 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff] Dec 16 12:45:27.142607 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Dec 16 12:45:27.142628 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Dec 16 12:45:27.142642 kernel: PCI: CLS 0 bytes, default 64 Dec 16 12:45:27.142661 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns Dec 16 12:45:27.142681 kernel: Initialise system trusted keyrings Dec 16 12:45:27.142700 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Dec 16 12:45:27.142743 kernel: Key type asymmetric registered Dec 16 12:45:27.142767 kernel: Asymmetric key parser 'x509' registered Dec 16 12:45:27.142785 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Dec 16 12:45:27.142804 kernel: io scheduler mq-deadline registered Dec 16 12:45:27.142813 kernel: io scheduler kyber registered Dec 16 12:45:27.142821 kernel: io scheduler bfq registered Dec 16 12:45:27.142978 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Dec 16 12:45:27.143070 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Dec 16 12:45:27.143213 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Dec 16 12:45:27.143318 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Dec 16 12:45:27.143399 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Dec 16 12:45:27.143574 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Dec 16 12:45:27.143664 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Dec 16 12:45:27.143817 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Dec 16 12:45:27.143906 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Dec 16 12:45:27.144078 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Dec 16 12:45:27.144173 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Dec 16 12:45:27.144296 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Dec 16 12:45:27.144409 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Dec 16 12:45:27.144489 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Dec 16 12:45:27.144864 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Dec 16 12:45:27.145073 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Dec 16 12:45:27.145090 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Dec 16 12:45:27.145258 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Dec 16 12:45:27.145342 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Dec 16 12:45:27.145356 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Dec 16 12:45:27.145375 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Dec 16 12:45:27.145394 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 12:45:27.145410 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Dec 16 12:45:27.145424 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Dec 16 12:45:27.145442 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Dec 16 12:45:27.145453 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Dec 16 12:45:27.145596 kernel: rtc_cmos 00:03: RTC can wake from S4 Dec 16 12:45:27.145613 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Dec 16 12:45:27.145721 kernel: rtc_cmos 00:03: registered as rtc0 Dec 16 12:45:27.145882 
kernel: rtc_cmos 00:03: setting system clock to 2025-12-16T12:45:25 UTC (1765889125) Dec 16 12:45:27.146018 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Dec 16 12:45:27.146036 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Dec 16 12:45:27.146050 kernel: NET: Registered PF_INET6 protocol family Dec 16 12:45:27.146067 kernel: Segment Routing with IPv6 Dec 16 12:45:27.146079 kernel: In-situ OAM (IOAM) with IPv6 Dec 16 12:45:27.146091 kernel: NET: Registered PF_PACKET protocol family Dec 16 12:45:27.146103 kernel: Key type dns_resolver registered Dec 16 12:45:27.146120 kernel: IPI shorthand broadcast: enabled Dec 16 12:45:27.146135 kernel: sched_clock: Marking stable (2197046219, 255248765)->(2490753554, -38458570) Dec 16 12:45:27.146142 kernel: registered taskstats version 1 Dec 16 12:45:27.146155 kernel: Loading compiled-in X.509 certificates Dec 16 12:45:27.146168 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: aafd1eb27ea805b8231c3bede9210239fae84df8' Dec 16 12:45:27.146185 kernel: Demotion targets for Node 0: null Dec 16 12:45:27.146199 kernel: Key type .fscrypt registered Dec 16 12:45:27.146213 kernel: Key type fscrypt-provisioning registered Dec 16 12:45:27.146225 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 16 12:45:27.146237 kernel: ima: Allocated hash algorithm: sha1 Dec 16 12:45:27.146253 kernel: ima: No architecture policies found Dec 16 12:45:27.146266 kernel: clk: Disabling unused clocks Dec 16 12:45:27.146278 kernel: Freeing unused kernel image (initmem) memory: 15556K Dec 16 12:45:27.146291 kernel: Write protecting the kernel read-only data: 47104k Dec 16 12:45:27.146304 kernel: Freeing unused kernel image (rodata/data gap) memory: 1132K Dec 16 12:45:27.146316 kernel: Run /init as init process Dec 16 12:45:27.146329 kernel: with arguments: Dec 16 12:45:27.146342 kernel: /init Dec 16 12:45:27.146358 kernel: with environment: Dec 16 12:45:27.146373 kernel: HOME=/ Dec 16 12:45:27.146389 kernel: TERM=linux Dec 16 12:45:27.146398 kernel: ACPI: bus type USB registered Dec 16 12:45:27.146405 kernel: usbcore: registered new interface driver usbfs Dec 16 12:45:27.146413 kernel: usbcore: registered new interface driver hub Dec 16 12:45:27.146425 kernel: usbcore: registered new device driver usb Dec 16 12:45:27.146643 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 12:45:27.146833 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Dec 16 12:45:27.146977 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Dec 16 12:45:27.147129 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 12:45:27.147339 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Dec 16 12:45:27.147569 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Dec 16 12:45:27.147793 kernel: hub 1-0:1.0: USB hub found Dec 16 12:45:27.147964 kernel: hub 1-0:1.0: 4 ports detected Dec 16 12:45:27.148146 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Dec 16 12:45:27.148324 kernel: hub 2-0:1.0: USB hub found Dec 16 12:45:27.148489 kernel: hub 2-0:1.0: 4 ports detected Dec 16 12:45:27.148544 kernel: SCSI subsystem initialized Dec 16 12:45:27.148558 kernel: libata version 3.00 loaded. 
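The xhci controller at 0000:02:00.0 registers two USB buses and their root hubs above; once the system is up, the resulting topology is visible under /sys/bus/usb/devices. A small sketch that prints vendor:product IDs, assuming that sysfs path and skipping interface nodes that carry no ID attributes:

# Sketch: summarize USB devices the xhci/hub enumeration above produces.
# Assumes sysfs at /sys/bus/usb/devices; attributes missing on some nodes
# are simply skipped.
import os

USB_ROOT = "/sys/bus/usb/devices"

def read_attr(dev, attr):
    try:
        with open(os.path.join(USB_ROOT, dev, attr)) as f:
            return f.read().strip()
    except OSError:
        return None

for dev in sorted(os.listdir(USB_ROOT)):
    vid = read_attr(dev, "idVendor")
    pid = read_attr(dev, "idProduct")
    if vid is None or pid is None:      # interface nodes have no IDs
        continue
    product = read_attr(dev, "product") or "?"
    print(f"{dev}: {vid}:{pid} {product}")   # e.g. 0627:0001 QEMU USB Tablet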
Dec 16 12:45:27.148737 kernel: ahci 0000:00:1f.2: version 3.0 Dec 16 12:45:27.148760 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Dec 16 12:45:27.148904 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Dec 16 12:45:27.149042 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Dec 16 12:45:27.149206 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Dec 16 12:45:27.149381 kernel: scsi host0: ahci Dec 16 12:45:27.149601 kernel: scsi host1: ahci Dec 16 12:45:27.149768 kernel: scsi host2: ahci Dec 16 12:45:27.149926 kernel: scsi host3: ahci Dec 16 12:45:27.150102 kernel: scsi host4: ahci Dec 16 12:45:27.150280 kernel: scsi host5: ahci Dec 16 12:45:27.150303 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 38 lpm-pol 1 Dec 16 12:45:27.150317 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 38 lpm-pol 1 Dec 16 12:45:27.150329 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 38 lpm-pol 1 Dec 16 12:45:27.150342 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 38 lpm-pol 1 Dec 16 12:45:27.150355 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 38 lpm-pol 1 Dec 16 12:45:27.150372 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 38 lpm-pol 1 Dec 16 12:45:27.150617 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Dec 16 12:45:27.150641 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 16 12:45:27.150656 kernel: ata4: SATA link down (SStatus 0 SControl 300) Dec 16 12:45:27.150668 kernel: ata6: SATA link down (SStatus 0 SControl 300) Dec 16 12:45:27.150683 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Dec 16 12:45:27.150699 kernel: ata1.00: LPM support broken, forcing max_power Dec 16 12:45:27.150711 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Dec 16 12:45:27.150724 kernel: ata1.00: applying bridge limits Dec 16 12:45:27.150736 kernel: ata3: SATA link down (SStatus 0 SControl 300) Dec 16 12:45:27.150749 kernel: ata2: SATA link down (SStatus 0 SControl 300) Dec 16 12:45:27.150762 kernel: ata1.00: LPM support broken, forcing max_power Dec 16 12:45:27.150774 kernel: ata1.00: configured for UDMA/100 Dec 16 12:45:27.150795 kernel: ata5: SATA link down (SStatus 0 SControl 300) Dec 16 12:45:27.150940 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Dec 16 12:45:27.150956 kernel: usbcore: registered new interface driver usbhid Dec 16 12:45:27.150964 kernel: usbhid: USB HID core driver Dec 16 12:45:27.151126 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Dec 16 12:45:27.151224 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Dec 16 12:45:27.151246 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Dec 16 12:45:27.151418 kernel: scsi host6: Virtio SCSI HBA Dec 16 12:45:27.151616 kernel: scsi 6:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Dec 16 12:45:27.151754 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Dec 16 12:45:27.151929 kernel: sd 6:0:0:0: Power-on or device reset occurred Dec 16 12:45:27.152028 kernel: sd 6:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Dec 16 12:45:27.152201 kernel: sd 6:0:0:0: [sda] Write Protect is off Dec 16 12:45:27.152219 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Dec 16 12:45:27.152371 kernel: sd 6:0:0:0: [sda] Mode Sense: 63 00 00 08 Dec 16 12:45:27.152550 
kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Dec 16 12:45:27.152681 kernel: sd 6:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Dec 16 12:45:27.152706 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 16 12:45:27.152724 kernel: GPT:25804799 != 80003071 Dec 16 12:45:27.152736 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 16 12:45:27.152747 kernel: GPT:25804799 != 80003071 Dec 16 12:45:27.152757 kernel: GPT: Use GNU Parted to correct GPT errors. Dec 16 12:45:27.152769 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 16 12:45:27.152892 kernel: sd 6:0:0:0: [sda] Attached SCSI disk Dec 16 12:45:27.152909 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 16 12:45:27.152917 kernel: device-mapper: uevent: version 1.0.3 Dec 16 12:45:27.152924 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 16 12:45:27.152931 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Dec 16 12:45:27.152938 kernel: raid6: avx2x4 gen() 33118 MB/s Dec 16 12:45:27.152945 kernel: raid6: avx2x2 gen() 35762 MB/s Dec 16 12:45:27.152953 kernel: raid6: avx2x1 gen() 28423 MB/s Dec 16 12:45:27.152960 kernel: raid6: using algorithm avx2x2 gen() 35762 MB/s Dec 16 12:45:27.152968 kernel: raid6: .... xor() 31853 MB/s, rmw enabled Dec 16 12:45:27.152976 kernel: raid6: using avx2x2 recovery algorithm Dec 16 12:45:27.152983 kernel: xor: automatically using best checksumming function avx Dec 16 12:45:27.152990 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 16 12:45:27.153001 kernel: BTRFS: device fsid 57a8262f-2900-48ba-a17e-aafbd70d59c7 devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (185) Dec 16 12:45:27.153013 kernel: BTRFS info (device dm-0): first mount of filesystem 57a8262f-2900-48ba-a17e-aafbd70d59c7 Dec 16 12:45:27.153031 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Dec 16 12:45:27.153051 kernel: BTRFS info (device dm-0): enabling ssd optimizations Dec 16 12:45:27.153064 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 16 12:45:27.153081 kernel: BTRFS info (device dm-0): enabling free space tree Dec 16 12:45:27.153097 kernel: loop: module loaded Dec 16 12:45:27.153105 kernel: loop0: detected capacity change from 0 to 100528 Dec 16 12:45:27.153113 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 16 12:45:27.153121 systemd[1]: Successfully made /usr/ read-only. Dec 16 12:45:27.153136 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 12:45:27.153144 systemd[1]: Detected virtualization kvm. Dec 16 12:45:27.153151 systemd[1]: Detected architecture x86-64. Dec 16 12:45:27.153159 systemd[1]: Running in initrd. Dec 16 12:45:27.153166 systemd[1]: No hostname configured, using default hostname. Dec 16 12:45:27.153175 systemd[1]: Hostname set to . Dec 16 12:45:27.153182 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Dec 16 12:45:27.153189 systemd[1]: Queued start job for default target initrd.target. 
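The GPT warnings above (GPT:25804799 != 80003071) mean the primary header records its backup copy at LBA 25804799 while the 80003072-sector disk actually ends at LBA 80003071, which is typical when a smaller disk image is written to a larger volume. A sketch of the same check, assuming 512-byte sectors as reported for sda and read access to the device (the device path is an example):

# Sketch: reproduce the kernel's GPT check. The primary GPT header lives at
# LBA 1; its "alternate LBA" field is an 8-byte little-endian value at byte
# offset 32 of the header. Compare it against the disk's real last LBA.
import struct

DISK = "/dev/sda"      # example device from the log
SECTOR = 512           # logical sector size reported for sda

with open(DISK, "rb") as f:
    f.seek(1 * SECTOR)                 # primary GPT header at LBA 1
    header = f.read(92)
    f.seek(0, 2)                       # end of the device
    last_lba = f.tell() // SECTOR - 1

assert header[:8] == b"EFI PART", "no GPT signature found"
(alt_lba,) = struct.unpack_from("<Q", header, 32)
if alt_lba != last_lba:
    print(f"backup header recorded at LBA {alt_lba}, "
          f"but the disk ends at LBA {last_lba} (matches the kernel warning)")

As the kernel message suggests, GNU Parted can repair this; sgdisk -e is a commonly used alternative that moves the backup GPT structures to the true end of the disk.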
Dec 16 12:45:27.153197 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Dec 16 12:45:27.153204 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:45:27.153212 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:45:27.153220 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 16 12:45:27.153229 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 12:45:27.153238 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 16 12:45:27.153250 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 16 12:45:27.153264 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:45:27.153283 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:45:27.153303 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 16 12:45:27.153320 systemd[1]: Reached target paths.target - Path Units. Dec 16 12:45:27.153337 systemd[1]: Reached target slices.target - Slice Units. Dec 16 12:45:27.153345 systemd[1]: Reached target swap.target - Swaps. Dec 16 12:45:27.153353 systemd[1]: Reached target timers.target - Timer Units. Dec 16 12:45:27.153361 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 12:45:27.153368 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 12:45:27.153380 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:45:27.153387 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 16 12:45:27.153394 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 16 12:45:27.153402 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:45:27.153409 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 12:45:27.153417 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:45:27.153424 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 12:45:27.153433 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 16 12:45:27.153441 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 16 12:45:27.153448 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 12:45:27.153456 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 16 12:45:27.153464 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 16 12:45:27.153473 systemd[1]: Starting systemd-fsck-usr.service... Dec 16 12:45:27.153482 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 12:45:27.153519 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 12:45:27.153575 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:45:27.153595 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. 
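systemd notes "Detected virtualization kvm", "Detected architecture x86-64" and "Running in initrd" a little earlier; both facts can be re-derived from a script. The sketch below assumes the documented /etc/initrd-release marker file and the systemd-detect-virt helper are present on this image:

# Sketch: re-check two facts systemd logs above ("Running in initrd.",
# "Detected virtualization kvm"). The marker file and helper tool are
# standard systemd conventions, but their presence here is an assumption.
import os
import subprocess

in_initrd = os.path.exists("/etc/initrd-release")   # marker an initrd is expected to ship
print("in initrd:", in_initrd)

try:
    virt = subprocess.run(["systemd-detect-virt"], capture_output=True,
                          text=True).stdout.strip()
except FileNotFoundError:
    virt = "unknown (systemd-detect-virt not installed)"
print("virtualization:", virt or "none")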
Dec 16 12:45:27.153612 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:45:27.153620 systemd[1]: Finished systemd-fsck-usr.service. Dec 16 12:45:27.153654 systemd-journald[322]: Collecting audit messages is enabled. Dec 16 12:45:27.153687 kernel: audit: type=1130 audit(1765889127.093:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.153703 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 12:45:27.153723 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 16 12:45:27.153737 kernel: Bridge firewalling registered Dec 16 12:45:27.153750 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 12:45:27.153763 systemd-journald[322]: Journal started Dec 16 12:45:27.153796 systemd-journald[322]: Runtime Journal (/run/log/journal/59c4bfc1f79a432983f262831beaf235) is 4.7M, max 38.1M, 33.4M free. Dec 16 12:45:27.093000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.138477 systemd-modules-load[324]: Inserted module 'br_netfilter' Dec 16 12:45:27.229245 kernel: audit: type=1130 audit(1765889127.219:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.229270 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 12:45:27.219000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.229000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.235718 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:45:27.243045 kernel: audit: type=1130 audit(1765889127.229:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.243074 kernel: audit: type=1130 audit(1765889127.235:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.235000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.237150 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 12:45:27.251668 kernel: audit: type=1130 audit(1765889127.243:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:27.243000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.247568 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 16 12:45:27.256187 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 12:45:27.259610 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 12:45:27.262698 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 12:45:27.274756 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:45:27.276000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.282562 systemd-tmpfiles[341]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 16 12:45:27.286435 kernel: audit: type=1130 audit(1765889127.276:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.285575 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 12:45:27.293366 kernel: audit: type=1130 audit(1765889127.285:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.285000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.288052 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:45:27.300886 kernel: audit: type=1130 audit(1765889127.293:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.293000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.294334 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:45:27.308340 kernel: audit: type=1130 audit(1765889127.301:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.301000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.304628 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 16 12:45:27.309000 audit: BPF prog-id=6 op=LOAD Dec 16 12:45:27.318835 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
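Several audit SERVICE_START records (type=1130, msg='unit=... res=success') are interleaved with the journal output above. A small sketch for tallying which units they report, assuming the log has been saved to a plain text file (the file name is an example):

# Sketch: pull unit names and results out of the SERVICE_START audit records
# shown in this log and count them.
import re
from collections import Counter

PATTERN = re.compile(r"SERVICE_START.*?unit=([\w@\-\.\\]+).*?res=(\w+)")

def summarize(path="boot.log"):
    results = Counter()
    with open(path, errors="replace") as f:
        for line in f:
            m = PATTERN.search(line)
            if m:
                results[m.groups()] += 1
    for (unit, res), n in sorted(results.items()):
        print(f"{unit}: {res} x{n}")

if __name__ == "__main__":
    summarize()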
Dec 16 12:45:27.330237 dracut-cmdline[357]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=553464fdb0286a5b06b399da29ca659e521c68f08ea70a931c96ddffd00b4357 Dec 16 12:45:27.366413 systemd-resolved[358]: Positive Trust Anchors: Dec 16 12:45:27.367266 systemd-resolved[358]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 12:45:27.367276 systemd-resolved[358]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Dec 16 12:45:27.367302 systemd-resolved[358]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 12:45:27.388806 systemd-resolved[358]: Defaulting to hostname 'linux'. Dec 16 12:45:27.390000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.389866 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 12:45:27.390622 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:45:27.415570 kernel: Loading iSCSI transport class v2.0-870. Dec 16 12:45:27.430562 kernel: iscsi: registered transport (tcp) Dec 16 12:45:27.452923 kernel: iscsi: registered transport (qla4xxx) Dec 16 12:45:27.452970 kernel: QLogic iSCSI HBA Driver Dec 16 12:45:27.472239 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 12:45:27.486650 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:45:27.487000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.489383 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 12:45:27.522275 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 16 12:45:27.522000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.524640 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 16 12:45:27.528652 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 16 12:45:27.553386 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 16 12:45:27.554000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:27.555000 audit: BPF prog-id=7 op=LOAD Dec 16 12:45:27.555000 audit: BPF prog-id=8 op=LOAD Dec 16 12:45:27.557626 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:45:27.580517 systemd-udevd[603]: Using default interface naming scheme 'v257'. Dec 16 12:45:27.594821 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 12:45:27.598000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.598944 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:45:27.599000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.602260 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 16 12:45:27.603000 audit: BPF prog-id=9 op=LOAD Dec 16 12:45:27.605738 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 12:45:27.621677 dracut-pre-trigger[698]: rd.md=0: removing MD RAID activation Dec 16 12:45:27.637849 systemd-networkd[699]: lo: Link UP Dec 16 12:45:27.637855 systemd-networkd[699]: lo: Gained carrier Dec 16 12:45:27.639000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.639616 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 12:45:27.640368 systemd[1]: Reached target network.target - Network. Dec 16 12:45:27.642000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.641923 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 12:45:27.644641 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 12:45:27.700564 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:45:27.702000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.706115 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 16 12:45:27.794308 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Dec 16 12:45:27.812101 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Dec 16 12:45:27.838555 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Dec 16 12:45:27.841546 kernel: cryptd: max_cpu_qlen set to 1000 Dec 16 12:45:27.849798 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 16 12:45:27.861261 systemd-networkd[699]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:45:27.864630 kernel: AES CTR mode by8 optimization enabled Dec 16 12:45:27.861268 systemd-networkd[699]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
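The dracut-cmdline entry above echoes the kernel command line the initrd acts on (rd.driver.pre, flatcar.oem.id, verity.usrhash, and so on). Such parameters are space-separated tokens that are either bare flags or key=value pairs. A small sketch of reading them from /proc/cmdline follows, assuming no quoted values (which the real parsers do handle); it is illustrative only, not dracut's parser.

#!/usr/bin/env python3
# Sketch: split the kernel command line into flags and key=value pairs,
# the kind of parameters dracut-cmdline reports above. Quoted values are
# not handled, and repeated keys (e.g. console=) keep only the last value.
def parse_cmdline(text: str) -> dict:
    params = {}
    for token in text.split():
        key, sep, value = token.partition("=")
        params[key] = value if sep else True
    return params

if __name__ == "__main__":
    with open("/proc/cmdline") as f:
        params = parse_cmdline(f.read())
    print("OEM id:", params.get("flatcar.oem.id"))
    print("root:", params.get("root"))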
Dec 16 12:45:27.864849 systemd-networkd[699]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:45:27.864852 systemd-networkd[699]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:45:27.865871 systemd-networkd[699]: eth0: Link UP Dec 16 12:45:27.867589 systemd-networkd[699]: eth1: Link UP Dec 16 12:45:27.867737 systemd-networkd[699]: eth0: Gained carrier Dec 16 12:45:27.867747 systemd-networkd[699]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:45:27.871163 systemd-networkd[699]: eth1: Gained carrier Dec 16 12:45:27.871175 systemd-networkd[699]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:45:27.873683 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Dec 16 12:45:27.887842 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 16 12:45:27.890183 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:45:27.891000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.890328 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:45:27.891964 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:45:27.899884 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:45:27.911776 disk-uuid[834]: Primary Header is updated. Dec 16 12:45:27.911776 disk-uuid[834]: Secondary Entries is updated. Dec 16 12:45:27.911776 disk-uuid[834]: Secondary Header is updated. Dec 16 12:45:27.916598 systemd-networkd[699]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Dec 16 12:45:27.918000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:27.918691 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 16 12:45:27.921590 systemd-networkd[699]: eth0: DHCPv4 address 77.42.23.34/32, gateway 172.31.1.1 acquired from 172.31.1.1 Dec 16 12:45:27.924390 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 12:45:27.930547 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:45:27.934870 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 12:45:27.943020 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 16 12:45:28.054910 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:45:28.056000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:28.067776 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 16 12:45:28.068000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:28.987453 disk-uuid[836]: Warning: The kernel is still using the old partition table. Dec 16 12:45:28.987453 disk-uuid[836]: The new table will be used at the next reboot or after you Dec 16 12:45:28.987453 disk-uuid[836]: run partprobe(8) or kpartx(8) Dec 16 12:45:28.987453 disk-uuid[836]: The operation has completed successfully. Dec 16 12:45:28.996370 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 16 12:45:29.002158 kernel: kauditd_printk_skb: 17 callbacks suppressed Dec 16 12:45:29.002193 kernel: audit: type=1130 audit(1765889128.996:28): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:28.996000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:28.996472 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 16 12:45:29.015844 kernel: audit: type=1131 audit(1765889128.996:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:28.996000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:28.998296 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 16 12:45:29.039560 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (863) Dec 16 12:45:29.039609 kernel: BTRFS info (device sda6): first mount of filesystem 7e31dbd7-b976-4d4a-a2e9-e2baacf4ad38 Dec 16 12:45:29.050910 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 12:45:29.058978 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 12:45:29.059023 kernel: BTRFS info (device sda6): turning on async discard Dec 16 12:45:29.063007 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 12:45:29.071568 kernel: BTRFS info (device sda6): last unmount of filesystem 7e31dbd7-b976-4d4a-a2e9-e2baacf4ad38 Dec 16 12:45:29.072178 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 16 12:45:29.072000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.075640 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 16 12:45:29.083095 kernel: audit: type=1130 audit(1765889129.072:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.196455 ignition[882]: Ignition 2.24.0 Dec 16 12:45:29.196472 ignition[882]: Stage: fetch-offline Dec 16 12:45:29.196851 ignition[882]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:45:29.198454 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 12:45:29.207635 kernel: audit: type=1130 audit(1765889129.199:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:29.199000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.196864 ignition[882]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:45:29.200636 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Dec 16 12:45:29.196929 ignition[882]: parsed url from cmdline: "" Dec 16 12:45:29.196932 ignition[882]: no config URL provided Dec 16 12:45:29.196936 ignition[882]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 12:45:29.196942 ignition[882]: no config at "/usr/lib/ignition/user.ign" Dec 16 12:45:29.196945 ignition[882]: failed to fetch config: resource requires networking Dec 16 12:45:29.197133 ignition[882]: Ignition finished successfully Dec 16 12:45:29.223996 ignition[888]: Ignition 2.24.0 Dec 16 12:45:29.224014 ignition[888]: Stage: fetch Dec 16 12:45:29.224138 ignition[888]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:45:29.224146 ignition[888]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:45:29.224211 ignition[888]: parsed url from cmdline: "" Dec 16 12:45:29.224214 ignition[888]: no config URL provided Dec 16 12:45:29.224217 ignition[888]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 12:45:29.224223 ignition[888]: no config at "/usr/lib/ignition/user.ign" Dec 16 12:45:29.224250 ignition[888]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Dec 16 12:45:29.227072 ignition[888]: GET result: OK Dec 16 12:45:29.227162 ignition[888]: parsing config with SHA512: 2fdea5eb218bad585068058d3ad4dc4c70663cc3666acc1ec64b1577307b28680b974a9bf636b4d3d78e5c6fbb846379c337c5989367270bda29d5d604887cae Dec 16 12:45:29.233851 unknown[888]: fetched base config from "system" Dec 16 12:45:29.233863 unknown[888]: fetched base config from "system" Dec 16 12:45:29.234192 ignition[888]: fetch: fetch complete Dec 16 12:45:29.233867 unknown[888]: fetched user config from "hetzner" Dec 16 12:45:29.234197 ignition[888]: fetch: fetch passed Dec 16 12:45:29.235689 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 16 12:45:29.246622 kernel: audit: type=1130 audit(1765889129.238:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.238000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.234256 ignition[888]: Ignition finished successfully Dec 16 12:45:29.240851 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 16 12:45:29.261066 ignition[894]: Ignition 2.24.0 Dec 16 12:45:29.261836 ignition[894]: Stage: kargs Dec 16 12:45:29.262503 ignition[894]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:45:29.262541 ignition[894]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:45:29.263119 ignition[894]: kargs: kargs passed Dec 16 12:45:29.263148 ignition[894]: Ignition finished successfully Dec 16 12:45:29.267059 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
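The fetch stage above pulls the instance user data from Hetzner's link-local metadata service (http://169.254.169.254/hetzner/v1/userdata) and logs a SHA512 of the parsed config. Below is a hedged sketch of that fetch step; it only works from inside such a VM, and it hashes the raw response bytes while Ignition hashes the parsed config, so the digests are not expected to match the one in the log.

#!/usr/bin/env python3
# Sketch: fetch the Hetzner user data that Ignition's fetch stage reads,
# and print its SHA-512. The endpoint URL is taken from the log above.
import hashlib
import urllib.request

URL = "http://169.254.169.254/hetzner/v1/userdata"

with urllib.request.urlopen(URL, timeout=5) as resp:
    body = resp.read()

print("bytes fetched:", len(body))
print("sha512:", hashlib.sha512(body).hexdigest())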
Dec 16 12:45:29.275935 kernel: audit: type=1130 audit(1765889129.267:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.267000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.268465 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 16 12:45:29.290082 ignition[900]: Ignition 2.24.0 Dec 16 12:45:29.290094 ignition[900]: Stage: disks Dec 16 12:45:29.291881 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 16 12:45:29.299384 kernel: audit: type=1130 audit(1765889129.292:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.292000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.290209 ignition[900]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:45:29.293444 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 16 12:45:29.290216 ignition[900]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:45:29.300209 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 16 12:45:29.290830 ignition[900]: disks: disks passed Dec 16 12:45:29.301655 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 12:45:29.290861 ignition[900]: Ignition finished successfully Dec 16 12:45:29.303214 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 12:45:29.304948 systemd[1]: Reached target basic.target - Basic System. Dec 16 12:45:29.307430 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 16 12:45:29.337603 systemd-fsck[908]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Dec 16 12:45:29.339374 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 16 12:45:29.347908 kernel: audit: type=1130 audit(1765889129.340:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.340000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.343610 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 16 12:45:29.449562 kernel: EXT4-fs (sda9): mounted filesystem 1314c107-11a5-486b-9d52-be9f57b6bf1b r/w with ordered data mode. Quota mode: none. Dec 16 12:45:29.449746 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 16 12:45:29.450966 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 16 12:45:29.453733 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 12:45:29.456161 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 16 12:45:29.461018 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... 
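The flatcar-metadata-hostname agent starting here queries the same metadata service for the instance hostname (its GET shows up in the log a little further on) and writes it under the sysroot. A minimal sketch of that behaviour follows; the URL comes from the log, but the sketch writes to a local file named hostname instead of /sysroot/etc/hostname so it stays harmless outside the initrd.

#!/usr/bin/env python3
# Sketch: fetch the instance hostname the way the metadata agent does,
# then write it out. Writing to ./hostname is an illustrative stand-in
# for the real /sysroot/etc/hostname target.
import urllib.request

URL = "http://169.254.169.254/hetzner/v1/metadata/hostname"

with urllib.request.urlopen(URL, timeout=5) as resp:
    hostname = resp.read().decode("utf-8").strip()

with open("hostname", "w", encoding="utf-8") as f:
    f.write(hostname + "\n")

print("wrote hostname:", hostname)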
Dec 16 12:45:29.464651 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 16 12:45:29.465647 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:45:29.467823 systemd-networkd[699]: eth1: Gained IPv6LL Dec 16 12:45:29.468714 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 16 12:45:29.471663 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 16 12:45:29.485142 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (916) Dec 16 12:45:29.485179 kernel: BTRFS info (device sda6): first mount of filesystem 7e31dbd7-b976-4d4a-a2e9-e2baacf4ad38 Dec 16 12:45:29.488900 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 12:45:29.500920 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 12:45:29.500970 kernel: BTRFS info (device sda6): turning on async discard Dec 16 12:45:29.505612 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 12:45:29.511169 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 12:45:29.547557 coreos-metadata[918]: Dec 16 12:45:29.547 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Dec 16 12:45:29.549146 coreos-metadata[918]: Dec 16 12:45:29.549 INFO Fetch successful Dec 16 12:45:29.550749 coreos-metadata[918]: Dec 16 12:45:29.549 INFO wrote hostname ci-4547-0-0-c-452f5360ea to /sysroot/etc/hostname Dec 16 12:45:29.553154 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 16 12:45:29.554000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.561551 kernel: audit: type=1130 audit(1765889129.554:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.628931 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 16 12:45:29.637270 kernel: audit: type=1130 audit(1765889129.629:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.629000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.630658 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 16 12:45:29.644642 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 16 12:45:29.650086 systemd[1]: sysroot-oem.mount: Deactivated successfully. Dec 16 12:45:29.654204 kernel: BTRFS info (device sda6): last unmount of filesystem 7e31dbd7-b976-4d4a-a2e9-e2baacf4ad38 Dec 16 12:45:29.670335 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 16 12:45:29.672000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:29.675767 ignition[1018]: INFO : Ignition 2.24.0 Dec 16 12:45:29.677294 ignition[1018]: INFO : Stage: mount Dec 16 12:45:29.677294 ignition[1018]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:45:29.677294 ignition[1018]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:45:29.677294 ignition[1018]: INFO : mount: mount passed Dec 16 12:45:29.677294 ignition[1018]: INFO : Ignition finished successfully Dec 16 12:45:29.686000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:29.678492 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 16 12:45:29.688469 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 16 12:45:29.787731 systemd-networkd[699]: eth0: Gained IPv6LL Dec 16 12:45:30.451797 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 12:45:30.487614 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1029) Dec 16 12:45:30.492017 kernel: BTRFS info (device sda6): first mount of filesystem 7e31dbd7-b976-4d4a-a2e9-e2baacf4ad38 Dec 16 12:45:30.492067 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 12:45:30.500962 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 12:45:30.501024 kernel: BTRFS info (device sda6): turning on async discard Dec 16 12:45:30.505163 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 12:45:30.507285 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 12:45:30.532908 ignition[1046]: INFO : Ignition 2.24.0 Dec 16 12:45:30.532908 ignition[1046]: INFO : Stage: files Dec 16 12:45:30.535056 ignition[1046]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:45:30.535056 ignition[1046]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:45:30.535056 ignition[1046]: DEBUG : files: compiled without relabeling support, skipping Dec 16 12:45:30.538420 ignition[1046]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 16 12:45:30.538420 ignition[1046]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 16 12:45:30.541176 ignition[1046]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 16 12:45:30.541176 ignition[1046]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 16 12:45:30.541176 ignition[1046]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 16 12:45:30.540760 unknown[1046]: wrote ssh authorized keys file for user: core Dec 16 12:45:30.545449 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Dec 16 12:45:30.545449 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Dec 16 12:45:30.767290 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 16 12:45:31.241117 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Dec 16 12:45:31.241117 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 16 12:45:31.244126 
ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 16 12:45:31.244126 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:45:31.244126 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:45:31.244126 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:45:31.244126 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:45:31.244126 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:45:31.244126 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:45:31.244126 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 12:45:31.244126 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 12:45:31.244126 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Dec 16 12:45:31.255971 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Dec 16 12:45:31.255971 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Dec 16 12:45:31.255971 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-x86-64.raw: attempt #1 Dec 16 12:45:31.846026 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 16 12:45:33.133878 ignition[1046]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Dec 16 12:45:33.133878 ignition[1046]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 16 12:45:33.139338 ignition[1046]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:45:33.143482 ignition[1046]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:45:33.143482 ignition[1046]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 16 12:45:33.143482 ignition[1046]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Dec 16 12:45:33.143482 ignition[1046]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Dec 16 12:45:33.143482 ignition[1046]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at 
"/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Dec 16 12:45:33.143482 ignition[1046]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Dec 16 12:45:33.143482 ignition[1046]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Dec 16 12:45:33.143482 ignition[1046]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Dec 16 12:45:33.143482 ignition[1046]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:45:33.143482 ignition[1046]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:45:33.143482 ignition[1046]: INFO : files: files passed Dec 16 12:45:33.143482 ignition[1046]: INFO : Ignition finished successfully Dec 16 12:45:33.150000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.180000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.180000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.143735 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 16 12:45:33.154741 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 16 12:45:33.159775 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 16 12:45:33.178220 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 16 12:45:33.178365 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 16 12:45:33.193349 initrd-setup-root-after-ignition[1077]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:45:33.193349 initrd-setup-root-after-ignition[1077]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:45:33.197181 initrd-setup-root-after-ignition[1081]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:45:33.197972 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:45:33.200000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.200811 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 16 12:45:33.205759 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 16 12:45:33.275826 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 16 12:45:33.275911 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Dec 16 12:45:33.276000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:33.276000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.277674 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 16 12:45:33.279224 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 16 12:45:33.281134 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 16 12:45:33.282647 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 16 12:45:33.321000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.321033 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:45:33.324255 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 16 12:45:33.347817 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Dec 16 12:45:33.348051 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:45:33.349094 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:45:33.350934 systemd[1]: Stopped target timers.target - Timer Units. Dec 16 12:45:33.353000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.352688 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 16 12:45:33.352847 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:45:33.354893 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 16 12:45:33.356078 systemd[1]: Stopped target basic.target - Basic System. Dec 16 12:45:33.357661 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 16 12:45:33.365067 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:45:33.366572 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 16 12:45:33.367996 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 16 12:45:33.369693 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 16 12:45:33.371236 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 12:45:33.372881 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 16 12:45:33.374348 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 16 12:45:33.378000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.375934 systemd[1]: Stopped target swap.target - Swaps. Dec 16 12:45:33.377399 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 16 12:45:33.377502 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 16 12:45:33.379397 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. 
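Among the things the Ignition files stage recorded above is a symlink /etc/extensions/kubernetes.raw pointing at /opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw, which is how the Kubernetes sysext image gets picked up after the switch to the real root. A sketch of creating that link relative to a sysroot is below; the link name and target come from the log, while the scratch sysroot directory is purely illustrative.

#!/usr/bin/env python3
# Sketch: recreate the sysext symlink the files stage wrote above.
# SYSROOT points at a scratch directory so the sketch does not touch
# a real system; the target stays absolute, resolved inside the booted OS.
import os

SYSROOT = "./sysroot-demo"  # illustrative stand-in for /sysroot
LINK = "etc/extensions/kubernetes.raw"
TARGET = "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw"

link_path = os.path.join(SYSROOT, LINK)
os.makedirs(os.path.dirname(link_path), exist_ok=True)
if os.path.lexists(link_path):
    os.remove(link_path)
os.symlink(TARGET, link_path)
print(link_path, "->", os.readlink(link_path))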
Dec 16 12:45:33.384000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.380596 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:45:33.386000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.381921 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 16 12:45:33.387000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.382019 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:45:33.389000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.383427 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 16 12:45:33.383601 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 16 12:45:33.385505 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 16 12:45:33.385703 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:45:33.386699 systemd[1]: ignition-files.service: Deactivated successfully. Dec 16 12:45:33.386833 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 16 12:45:33.398000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.388220 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Dec 16 12:45:33.399000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.388357 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 16 12:45:33.401000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.390362 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 16 12:45:33.395399 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 16 12:45:33.396354 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 16 12:45:33.396451 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:45:33.408000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.408000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.398624 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Dec 16 12:45:33.398708 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:45:33.400298 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 16 12:45:33.401519 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 12:45:33.407147 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 16 12:45:33.407662 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 16 12:45:33.417558 ignition[1101]: INFO : Ignition 2.24.0 Dec 16 12:45:33.417558 ignition[1101]: INFO : Stage: umount Dec 16 12:45:33.417558 ignition[1101]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:45:33.417558 ignition[1101]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:45:33.422860 ignition[1101]: INFO : umount: umount passed Dec 16 12:45:33.422860 ignition[1101]: INFO : Ignition finished successfully Dec 16 12:45:33.422948 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 16 12:45:33.425000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.423668 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 16 12:45:33.426000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.423800 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 16 12:45:33.428000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.426108 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 16 12:45:33.429000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.426143 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 16 12:45:33.427608 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 16 12:45:33.432000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.427653 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 16 12:45:33.428864 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 16 12:45:33.428898 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 16 12:45:33.430134 systemd[1]: Stopped target network.target - Network. Dec 16 12:45:33.431358 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 16 12:45:33.431415 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 12:45:33.432717 systemd[1]: Stopped target paths.target - Path Units. Dec 16 12:45:33.433946 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 16 12:45:33.435616 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:45:33.436662 systemd[1]: Stopped target slices.target - Slice Units. 
Dec 16 12:45:33.444000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.438010 systemd[1]: Stopped target sockets.target - Socket Units. Dec 16 12:45:33.445000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.439375 systemd[1]: iscsid.socket: Deactivated successfully. Dec 16 12:45:33.439403 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 12:45:33.449000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.440758 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 16 12:45:33.440782 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 12:45:33.451000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.441960 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Dec 16 12:45:33.453000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.441979 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:45:33.455000 audit: BPF prog-id=6 op=UNLOAD Dec 16 12:45:33.443342 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 16 12:45:33.443379 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 16 12:45:33.457000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.444792 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 16 12:45:33.444824 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 16 12:45:33.446091 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 16 12:45:33.447578 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 16 12:45:33.462000 audit: BPF prog-id=9 op=UNLOAD Dec 16 12:45:33.449311 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 16 12:45:33.449379 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 16 12:45:33.466000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.450297 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 16 12:45:33.467000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.450351 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 16 12:45:33.469000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:33.452691 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 16 12:45:33.452761 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 16 12:45:33.456831 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 16 12:45:33.456932 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 16 12:45:33.459229 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 16 12:45:33.460595 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 16 12:45:33.460629 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:45:33.462489 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 16 12:45:33.464245 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 16 12:45:33.464287 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 12:45:33.466872 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 16 12:45:33.466906 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:45:33.485000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.468255 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 16 12:45:33.468287 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 16 12:45:33.469624 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:45:33.496000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.483778 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 16 12:45:33.498000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.483871 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:45:33.499000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.486755 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 16 12:45:33.486799 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 16 12:45:33.494219 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 16 12:45:33.494244 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:45:33.505000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.506000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.495613 systemd[1]: dracut-pre-udev.service: Deactivated successfully. 
Dec 16 12:45:33.507000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.495649 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 16 12:45:33.508000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.497758 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 16 12:45:33.497794 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 16 12:45:33.499227 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 16 12:45:33.499264 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 12:45:33.511000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.514000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.501589 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 16 12:45:33.505291 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 16 12:45:33.517000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.517000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:33.505333 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:45:33.506111 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 16 12:45:33.506146 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:45:33.506871 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Dec 16 12:45:33.506903 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 12:45:33.508257 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 16 12:45:33.508288 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:45:33.509591 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:45:33.509624 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:45:33.513028 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 16 12:45:33.513099 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 16 12:45:33.516697 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 16 12:45:33.516764 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 16 12:45:33.518492 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 16 12:45:33.520562 systemd[1]: Starting initrd-switch-root.service - Switch Root... 
Dec 16 12:45:33.535867 systemd[1]: Switching root. Dec 16 12:45:33.576038 systemd-journald[322]: Journal stopped Dec 16 12:45:34.571771 systemd-journald[322]: Received SIGTERM from PID 1 (systemd). Dec 16 12:45:34.571835 kernel: SELinux: policy capability network_peer_controls=1 Dec 16 12:45:34.571851 kernel: SELinux: policy capability open_perms=1 Dec 16 12:45:34.571863 kernel: SELinux: policy capability extended_socket_class=1 Dec 16 12:45:34.571874 kernel: SELinux: policy capability always_check_network=0 Dec 16 12:45:34.571883 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 16 12:45:34.571891 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 16 12:45:34.571903 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 16 12:45:34.571912 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 16 12:45:34.571921 kernel: SELinux: policy capability userspace_initial_context=0 Dec 16 12:45:34.571931 systemd[1]: Successfully loaded SELinux policy in 63.713ms. Dec 16 12:45:34.571944 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.160ms. Dec 16 12:45:34.571955 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 12:45:34.571964 systemd[1]: Detected virtualization kvm. Dec 16 12:45:34.571975 systemd[1]: Detected architecture x86-64. Dec 16 12:45:34.571986 systemd[1]: Detected first boot. Dec 16 12:45:34.571996 systemd[1]: Hostname set to . Dec 16 12:45:34.572005 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Dec 16 12:45:34.572016 zram_generator::config[1144]: No configuration found. Dec 16 12:45:34.572031 kernel: Guest personality initialized and is inactive Dec 16 12:45:34.572041 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Dec 16 12:45:34.572049 kernel: Initialized host personality Dec 16 12:45:34.572059 kernel: NET: Registered PF_VSOCK protocol family Dec 16 12:45:34.572068 systemd[1]: Populated /etc with preset unit settings. Dec 16 12:45:34.572077 kernel: kauditd_printk_skb: 54 callbacks suppressed Dec 16 12:45:34.572086 kernel: audit: type=1334 audit(1765889134.214:92): prog-id=12 op=LOAD Dec 16 12:45:34.572094 kernel: audit: type=1334 audit(1765889134.214:93): prog-id=3 op=UNLOAD Dec 16 12:45:34.572102 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 16 12:45:34.572111 kernel: audit: type=1334 audit(1765889134.214:94): prog-id=13 op=LOAD Dec 16 12:45:34.572121 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 16 12:45:34.572129 kernel: audit: type=1334 audit(1765889134.214:95): prog-id=14 op=LOAD Dec 16 12:45:34.572138 kernel: audit: type=1334 audit(1765889134.214:96): prog-id=4 op=UNLOAD Dec 16 12:45:34.572146 kernel: audit: type=1334 audit(1765889134.214:97): prog-id=5 op=UNLOAD Dec 16 12:45:34.572155 kernel: audit: type=1131 audit(1765889134.215:98): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:34.572163 kernel: audit: type=1334 audit(1765889134.226:99): prog-id=12 op=UNLOAD Dec 16 12:45:34.572172 kernel: audit: type=1130 audit(1765889134.238:100): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=initrd-switch-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.572183 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 16 12:45:34.572193 kernel: audit: type=1131 audit(1765889134.238:101): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=initrd-switch-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.572205 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 16 12:45:34.572215 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 16 12:45:34.572225 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 16 12:45:34.572235 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 16 12:45:34.572245 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 16 12:45:34.572254 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Dec 16 12:45:34.572263 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 16 12:45:34.572273 systemd[1]: Created slice user.slice - User and Session Slice. Dec 16 12:45:34.572283 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:45:34.572292 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:45:34.572302 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 16 12:45:34.572311 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 16 12:45:34.572321 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 16 12:45:34.572331 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 12:45:34.572340 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Dec 16 12:45:34.572350 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:45:34.572359 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:45:34.572368 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 16 12:45:34.572377 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 16 12:45:34.572387 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 16 12:45:34.572395 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 16 12:45:34.572405 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:45:34.572414 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 12:45:34.572425 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Dec 16 12:45:34.572435 systemd[1]: Reached target slices.target - Slice Units. Dec 16 12:45:34.572446 systemd[1]: Reached target swap.target - Swaps. Dec 16 12:45:34.572462 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. 
Dec 16 12:45:34.572480 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 16 12:45:34.572491 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 16 12:45:34.572500 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:45:34.572512 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Dec 16 12:45:34.572522 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:45:34.573496 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Dec 16 12:45:34.573510 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Dec 16 12:45:34.573520 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 12:45:34.573608 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:45:34.573620 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 16 12:45:34.573636 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 16 12:45:34.573654 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 16 12:45:34.573672 systemd[1]: Mounting media.mount - External Media Directory... Dec 16 12:45:34.573689 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 12:45:34.573702 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 16 12:45:34.573712 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 16 12:45:34.573721 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 16 12:45:34.573735 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 16 12:45:34.573745 systemd[1]: Reached target machines.target - Containers. Dec 16 12:45:34.573754 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 16 12:45:34.573763 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:45:34.573773 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 12:45:34.573787 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 16 12:45:34.573804 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 12:45:34.573822 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 12:45:34.573835 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 12:45:34.573844 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 16 12:45:34.573854 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 12:45:34.573864 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 16 12:45:34.573873 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 16 12:45:34.573884 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 16 12:45:34.573893 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 16 12:45:34.573903 systemd[1]: Stopped systemd-fsck-usr.service. 
Dec 16 12:45:34.573912 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:45:34.573922 kernel: fuse: init (API version 7.41) Dec 16 12:45:34.573934 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 12:45:34.573944 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 12:45:34.573953 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 12:45:34.573964 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 16 12:45:34.573980 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 16 12:45:34.573999 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 12:45:34.574014 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 12:45:34.574024 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 16 12:45:34.574033 kernel: ACPI: bus type drm_connector registered Dec 16 12:45:34.574041 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 16 12:45:34.574050 systemd[1]: Mounted media.mount - External Media Directory. Dec 16 12:45:34.574061 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 16 12:45:34.574070 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 16 12:45:34.574079 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 16 12:45:34.574088 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:45:34.574115 systemd-journald[1219]: Collecting audit messages is enabled. Dec 16 12:45:34.574139 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 16 12:45:34.574149 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 16 12:45:34.574160 systemd-journald[1219]: Journal started Dec 16 12:45:34.574180 systemd-journald[1219]: Runtime Journal (/run/log/journal/59c4bfc1f79a432983f262831beaf235) is 4.7M, max 38.1M, 33.4M free. Dec 16 12:45:34.339000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Dec 16 12:45:34.476000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.481000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.487000 audit: BPF prog-id=14 op=UNLOAD Dec 16 12:45:34.487000 audit: BPF prog-id=13 op=UNLOAD Dec 16 12:45:34.488000 audit: BPF prog-id=15 op=LOAD Dec 16 12:45:34.488000 audit: BPF prog-id=16 op=LOAD Dec 16 12:45:34.488000 audit: BPF prog-id=17 op=LOAD Dec 16 12:45:34.568000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:34.569000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Dec 16 12:45:34.569000 audit[1219]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=5 a1=7fff56bd3190 a2=4000 a3=0 items=0 ppid=1 pid=1219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:34.569000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Dec 16 12:45:34.206993 systemd[1]: Queued start job for default target multi-user.target. Dec 16 12:45:34.574000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.574000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.215129 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Dec 16 12:45:34.215433 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 16 12:45:34.577585 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 12:45:34.577000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.580000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.580000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.579793 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 12:45:34.579933 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 12:45:34.580897 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 12:45:34.581024 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 12:45:34.581000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.581000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.582182 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 12:45:34.582303 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 12:45:34.582000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:34.582000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.583751 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 16 12:45:34.584000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.584000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.583870 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 16 12:45:34.584875 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 12:45:34.584991 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 12:45:34.585000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.585000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.586282 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 12:45:34.586000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.589000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.591000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.589828 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:45:34.591316 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 16 12:45:34.592302 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 16 12:45:34.592000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.605166 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 12:45:34.606327 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Dec 16 12:45:34.608620 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 16 12:45:34.610613 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
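The SYSCALL audit record just above (audit[1219]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 ... comm="systemd-journal") identifies the architecture and system call only by number. A minimal Python sketch that decodes just the fields shown there, assuming the standard x86_64 numbering (0xC000003E is AUDIT_ARCH_X86_64 and syscall 46 is sendmsg):

# Decode the numeric fields of the SYSCALL audit record quoted above.
# Assumes x86_64 numbering: arch 0xC000003E is AUDIT_ARCH_X86_64, syscall 46 is sendmsg.
AUDIT_ARCH = {0xC000003E: "x86_64"}
SYSCALL_NAMES_X86_64 = {46: "sendmsg"}  # only the entry this record needs

record = 'arch=c000003e syscall=46 success=yes exit=60 comm="systemd-journal"'
fields = dict(kv.split("=", 1) for kv in record.split())

arch = AUDIT_ARCH.get(int(fields["arch"], 16), "unknown")
name = SYSCALL_NAMES_X86_64.get(int(fields["syscall"]), "unknown")
comm = fields["comm"].strip('"')
print(f"{comm} called {name} on {arch}; return value {fields['exit']}")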
Dec 16 12:45:34.613624 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 16 12:45:34.613647 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 12:45:34.615373 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 16 12:45:34.619512 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:45:34.619737 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 12:45:34.625662 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 16 12:45:34.638675 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 16 12:45:34.639626 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 12:45:34.640642 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 16 12:45:34.642614 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 12:45:34.643770 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 12:45:34.645709 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 16 12:45:34.649396 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 12:45:34.651325 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 16 12:45:34.652000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.653321 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:45:34.654000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.656914 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 16 12:45:34.659247 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 16 12:45:34.668767 systemd-journald[1219]: Time spent on flushing to /var/log/journal/59c4bfc1f79a432983f262831beaf235 is 52.296ms for 1304 entries. Dec 16 12:45:34.668767 systemd-journald[1219]: System Journal (/var/log/journal/59c4bfc1f79a432983f262831beaf235) is 8M, max 588.1M, 580.1M free. Dec 16 12:45:34.742744 systemd-journald[1219]: Received client request to flush runtime journal. Dec 16 12:45:34.742808 kernel: loop1: detected capacity change from 0 to 8 Dec 16 12:45:34.742834 kernel: loop2: detected capacity change from 0 to 219144 Dec 16 12:45:34.677000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:34.692000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.739000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.675968 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 16 12:45:34.677810 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 16 12:45:34.680659 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 16 12:45:34.692759 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:45:34.724147 systemd-tmpfiles[1270]: ACLs are not supported, ignoring. Dec 16 12:45:34.724163 systemd-tmpfiles[1270]: ACLs are not supported, ignoring. Dec 16 12:45:34.735664 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 12:45:34.744000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.741081 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 16 12:45:34.744446 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 16 12:45:34.749522 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 16 12:45:34.750000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.769574 kernel: loop3: detected capacity change from 0 to 50784 Dec 16 12:45:34.788042 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 16 12:45:34.788000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.789000 audit: BPF prog-id=18 op=LOAD Dec 16 12:45:34.789000 audit: BPF prog-id=19 op=LOAD Dec 16 12:45:34.789000 audit: BPF prog-id=20 op=LOAD Dec 16 12:45:34.792817 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Dec 16 12:45:34.793000 audit: BPF prog-id=21 op=LOAD Dec 16 12:45:34.796636 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 12:45:34.804643 kernel: loop4: detected capacity change from 0 to 111560 Dec 16 12:45:34.802717 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 12:45:34.806000 audit: BPF prog-id=22 op=LOAD Dec 16 12:45:34.806000 audit: BPF prog-id=23 op=LOAD Dec 16 12:45:34.807000 audit: BPF prog-id=24 op=LOAD Dec 16 12:45:34.809712 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... 
Dec 16 12:45:34.812000 audit: BPF prog-id=25 op=LOAD Dec 16 12:45:34.812000 audit: BPF prog-id=26 op=LOAD Dec 16 12:45:34.812000 audit: BPF prog-id=27 op=LOAD Dec 16 12:45:34.813893 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 16 12:45:34.834033 systemd-tmpfiles[1295]: ACLs are not supported, ignoring. Dec 16 12:45:34.834052 systemd-tmpfiles[1295]: ACLs are not supported, ignoring. Dec 16 12:45:34.843948 kernel: loop5: detected capacity change from 0 to 8 Dec 16 12:45:34.841039 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:45:34.844000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.852568 kernel: loop6: detected capacity change from 0 to 219144 Dec 16 12:45:34.881577 kernel: loop7: detected capacity change from 0 to 50784 Dec 16 12:45:34.882668 systemd-nsresourced[1296]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Dec 16 12:45:34.883790 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Dec 16 12:45:34.884000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.896041 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 16 12:45:34.896000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:34.907787 kernel: loop1: detected capacity change from 0 to 111560 Dec 16 12:45:34.917888 (sd-merge)[1301]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-hetzner.raw'. Dec 16 12:45:34.925955 (sd-merge)[1301]: Merged extensions into '/usr'. Dec 16 12:45:34.933139 systemd[1]: Reload requested from client PID 1269 ('systemd-sysext') (unit systemd-sysext.service)... Dec 16 12:45:34.933244 systemd[1]: Reloading... Dec 16 12:45:35.035581 zram_generator::config[1341]: No configuration found. Dec 16 12:45:35.043281 systemd-oomd[1293]: No swap; memory pressure usage will be degraded Dec 16 12:45:35.063622 systemd-resolved[1294]: Positive Trust Anchors: Dec 16 12:45:35.063870 systemd-resolved[1294]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 12:45:35.063876 systemd-resolved[1294]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Dec 16 12:45:35.063903 systemd-resolved[1294]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 12:45:35.085160 systemd-resolved[1294]: Using system hostname 'ci-4547-0-0-c-452f5360ea'. Dec 16 12:45:35.235348 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. 
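The two positive trust anchors that systemd-resolved lists above are DNSSEC DS records for the root zone. A short Python sketch that splits one of them into its standard fields (key tag, algorithm, digest type, digest, per RFC 4034); the algorithm and digest-type names are the usual IANA assignments:

# Split a DS trust-anchor line exactly as systemd-resolved logs it above.
# DS field order per RFC 4034: key tag, algorithm, digest type, digest.
ALGORITHMS = {8: "RSASHA256"}
DIGEST_TYPES = {2: "SHA-256"}

line = ". IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d"
owner, _class, _rtype, key_tag, alg, digest_type, digest = line.split()

print(f"zone={owner!r} key_tag={key_tag} algorithm={ALGORITHMS[int(alg)]} "
      f"digest={DIGEST_TYPES[int(digest_type)]}:{digest} ({len(digest) * 4} bits)")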
Dec 16 12:45:35.235742 systemd[1]: Reloading finished in 302 ms. Dec 16 12:45:35.266645 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Dec 16 12:45:35.266000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.267508 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 12:45:35.267000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.268410 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 16 12:45:35.268000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.269340 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 16 12:45:35.269000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.273409 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:45:35.281679 systemd[1]: Starting ensure-sysext.service... Dec 16 12:45:35.285000 audit: BPF prog-id=8 op=UNLOAD Dec 16 12:45:35.285000 audit: BPF prog-id=7 op=UNLOAD Dec 16 12:45:35.285000 audit: BPF prog-id=28 op=LOAD Dec 16 12:45:35.285000 audit: BPF prog-id=29 op=LOAD Dec 16 12:45:35.283305 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 12:45:35.290000 audit: BPF prog-id=30 op=LOAD Dec 16 12:45:35.290000 audit: BPF prog-id=25 op=UNLOAD Dec 16 12:45:35.290000 audit: BPF prog-id=31 op=LOAD Dec 16 12:45:35.290000 audit: BPF prog-id=32 op=LOAD Dec 16 12:45:35.290000 audit: BPF prog-id=26 op=UNLOAD Dec 16 12:45:35.290000 audit: BPF prog-id=27 op=UNLOAD Dec 16 12:45:35.289932 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Dec 16 12:45:35.292000 audit: BPF prog-id=33 op=LOAD Dec 16 12:45:35.292000 audit: BPF prog-id=22 op=UNLOAD Dec 16 12:45:35.292000 audit: BPF prog-id=34 op=LOAD Dec 16 12:45:35.292000 audit: BPF prog-id=35 op=LOAD Dec 16 12:45:35.292000 audit: BPF prog-id=23 op=UNLOAD Dec 16 12:45:35.292000 audit: BPF prog-id=24 op=UNLOAD Dec 16 12:45:35.295000 audit: BPF prog-id=36 op=LOAD Dec 16 12:45:35.295000 audit: BPF prog-id=15 op=UNLOAD Dec 16 12:45:35.295000 audit: BPF prog-id=37 op=LOAD Dec 16 12:45:35.295000 audit: BPF prog-id=38 op=LOAD Dec 16 12:45:35.295000 audit: BPF prog-id=16 op=UNLOAD Dec 16 12:45:35.295000 audit: BPF prog-id=17 op=UNLOAD Dec 16 12:45:35.296000 audit: BPF prog-id=39 op=LOAD Dec 16 12:45:35.296000 audit: BPF prog-id=18 op=UNLOAD Dec 16 12:45:35.296000 audit: BPF prog-id=40 op=LOAD Dec 16 12:45:35.296000 audit: BPF prog-id=41 op=LOAD Dec 16 12:45:35.296000 audit: BPF prog-id=19 op=UNLOAD Dec 16 12:45:35.296000 audit: BPF prog-id=20 op=UNLOAD Dec 16 12:45:35.296000 audit: BPF prog-id=42 op=LOAD Dec 16 12:45:35.296000 audit: BPF prog-id=21 op=UNLOAD Dec 16 12:45:35.304445 systemd[1]: Reload requested from client PID 1388 ('systemctl') (unit ensure-sysext.service)... Dec 16 12:45:35.306580 systemd[1]: Reloading... Dec 16 12:45:35.311266 systemd-tmpfiles[1389]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 16 12:45:35.311318 systemd-tmpfiles[1389]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 16 12:45:35.311596 systemd-tmpfiles[1389]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 16 12:45:35.313240 systemd-tmpfiles[1389]: ACLs are not supported, ignoring. Dec 16 12:45:35.313334 systemd-tmpfiles[1389]: ACLs are not supported, ignoring. Dec 16 12:45:35.327890 systemd-tmpfiles[1389]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 12:45:35.327900 systemd-tmpfiles[1389]: Skipping /boot Dec 16 12:45:35.334209 systemd-tmpfiles[1389]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 12:45:35.334802 systemd-tmpfiles[1389]: Skipping /boot Dec 16 12:45:35.341710 systemd-udevd[1390]: Using default interface naming scheme 'v257'. Dec 16 12:45:35.380582 zram_generator::config[1417]: No configuration found. Dec 16 12:45:35.552570 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5 Dec 16 12:45:35.555566 kernel: mousedev: PS/2 mouse device common for all mice Dec 16 12:45:35.563565 kernel: ACPI: button: Power Button [PWRF] Dec 16 12:45:35.620389 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 16 12:45:35.621922 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Dec 16 12:45:35.622126 systemd[1]: Reloading finished in 315 ms. Dec 16 12:45:35.626736 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Dec 16 12:45:35.626981 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Dec 16 12:45:35.631268 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:45:35.632000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.633483 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Dec 16 12:45:35.634000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.638000 audit: BPF prog-id=43 op=LOAD Dec 16 12:45:35.639000 audit: BPF prog-id=33 op=UNLOAD Dec 16 12:45:35.639000 audit: BPF prog-id=44 op=LOAD Dec 16 12:45:35.640000 audit: BPF prog-id=45 op=LOAD Dec 16 12:45:35.640000 audit: BPF prog-id=34 op=UNLOAD Dec 16 12:45:35.640000 audit: BPF prog-id=35 op=UNLOAD Dec 16 12:45:35.640000 audit: BPF prog-id=46 op=LOAD Dec 16 12:45:35.640000 audit: BPF prog-id=39 op=UNLOAD Dec 16 12:45:35.640000 audit: BPF prog-id=47 op=LOAD Dec 16 12:45:35.642000 audit: BPF prog-id=48 op=LOAD Dec 16 12:45:35.642000 audit: BPF prog-id=40 op=UNLOAD Dec 16 12:45:35.642000 audit: BPF prog-id=41 op=UNLOAD Dec 16 12:45:35.642000 audit: BPF prog-id=49 op=LOAD Dec 16 12:45:35.642000 audit: BPF prog-id=42 op=UNLOAD Dec 16 12:45:35.642000 audit: BPF prog-id=50 op=LOAD Dec 16 12:45:35.642000 audit: BPF prog-id=51 op=LOAD Dec 16 12:45:35.642000 audit: BPF prog-id=28 op=UNLOAD Dec 16 12:45:35.642000 audit: BPF prog-id=29 op=UNLOAD Dec 16 12:45:35.643000 audit: BPF prog-id=52 op=LOAD Dec 16 12:45:35.644000 audit: BPF prog-id=30 op=UNLOAD Dec 16 12:45:35.644000 audit: BPF prog-id=53 op=LOAD Dec 16 12:45:35.644000 audit: BPF prog-id=54 op=LOAD Dec 16 12:45:35.644000 audit: BPF prog-id=31 op=UNLOAD Dec 16 12:45:35.644000 audit: BPF prog-id=32 op=UNLOAD Dec 16 12:45:35.646000 audit: BPF prog-id=55 op=LOAD Dec 16 12:45:35.646000 audit: BPF prog-id=36 op=UNLOAD Dec 16 12:45:35.646000 audit: BPF prog-id=56 op=LOAD Dec 16 12:45:35.646000 audit: BPF prog-id=57 op=LOAD Dec 16 12:45:35.646000 audit: BPF prog-id=37 op=UNLOAD Dec 16 12:45:35.646000 audit: BPF prog-id=38 op=UNLOAD Dec 16 12:45:35.657596 kernel: EDAC MC: Ver: 3.0.0 Dec 16 12:45:35.694657 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Dec 16 12:45:35.695212 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 12:45:35.697830 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:45:35.701807 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 16 12:45:35.702635 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:45:35.704790 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 12:45:35.717621 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 12:45:35.720612 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 12:45:35.721368 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:45:35.721520 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 12:45:35.724768 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 16 12:45:35.730971 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
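The dense run of BPF audit records above (audit: BPF prog-id=N op=LOAD / op=UNLOAD) is easiest to read as a running set: new program ids appear as LOAD while ids from the previous configuration generation show up as UNLOAD. A small Python sketch, fed lines in exactly that form, that tracks which prog-ids are still loaded:

import re

# Track which BPF program ids remain loaded after a stream of audit
# records in the "audit: BPF prog-id=N op=LOAD|UNLOAD" form seen above.
events = [
    "audit: BPF prog-id=43 op=LOAD",
    "audit: BPF prog-id=33 op=UNLOAD",
    "audit: BPF prog-id=44 op=LOAD",
]

loaded = set()
for line in events:
    m = re.search(r"prog-id=(\d+) op=(LOAD|UNLOAD)", line)
    if not m:
        continue
    prog_id, op = int(m.group(1)), m.group(2)
    if op == "LOAD":
        loaded.add(prog_id)
    else:
        loaded.discard(prog_id)

print(sorted(loaded))  # -> [43, 44]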
Dec 16 12:45:35.731916 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:45:35.736305 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 16 12:45:35.738000 audit: BPF prog-id=58 op=LOAD Dec 16 12:45:35.744412 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 12:45:35.752346 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 16 12:45:35.753427 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 12:45:35.759038 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 12:45:35.759514 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:45:35.760153 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:45:35.760269 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 12:45:35.760338 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:45:35.760403 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 12:45:35.767219 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 12:45:35.767391 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:45:35.770973 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 12:45:35.772294 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:45:35.772492 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 12:45:35.772688 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:45:35.772897 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 12:45:35.786000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.784601 systemd[1]: Finished ensure-sysext.service. Dec 16 12:45:35.787027 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 12:45:35.787195 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
Dec 16 12:45:35.789000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.789000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.790000 audit: BPF prog-id=59 op=LOAD Dec 16 12:45:35.807333 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Dec 16 12:45:35.810215 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 12:45:35.811709 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 12:45:35.812000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.813000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.822151 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 12:45:35.837757 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:45:35.847307 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 12:45:35.847651 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 12:45:35.848000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.849000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.854000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.854000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.857000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.850910 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 12:45:35.852914 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 12:45:35.855199 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 16 12:45:35.860894 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
Dec 16 12:45:35.873000 audit[1526]: SYSTEM_BOOT pid=1526 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.886317 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Dec 16 12:45:35.890693 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Dec 16 12:45:35.889950 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 16 12:45:35.891000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.901230 kernel: Console: switching to colour dummy device 80x25 Dec 16 12:45:35.903080 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 16 12:45:35.903139 kernel: [drm] features: -context_init Dec 16 12:45:35.919619 kernel: [drm] number of scanouts: 1 Dec 16 12:45:35.925800 kernel: [drm] number of cap sets: 0 Dec 16 12:45:35.928000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:35.929211 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 16 12:45:35.932857 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Dec 16 12:45:35.960388 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Dec 16 12:45:35.960619 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 12:45:35.966436 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 16 12:45:35.979000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Dec 16 12:45:35.979000 audit[1559]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffeacf05ce0 a2=420 a3=0 items=0 ppid=1511 pid=1559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:35.979000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:45:35.982145 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:45:35.984513 augenrules[1559]: No rules Dec 16 12:45:35.982354 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 12:45:36.015911 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 16 12:45:36.025766 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 16 12:45:36.026372 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Dec 16 12:45:36.026733 systemd[1]: Reached target time-set.target - System Time Set. Dec 16 12:45:36.030115 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:45:36.030305 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:45:36.049633 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
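The PROCTITLE value in the auditctl record above is the process's argv, NUL-separated and hex-encoded by the kernel. A two-line Python sketch decoding exactly the string logged:

# Decode the hex-encoded PROCTITLE field from the audit record above;
# the kernel joins argv with NUL bytes before hex-encoding it.
proctitle = "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"
print([arg.decode() for arg in bytes.fromhex(proctitle).split(b"\x00")])
# -> ['/sbin/auditctl', '-R', '/etc/audit/audit.rules']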
Dec 16 12:45:36.067445 systemd-networkd[1521]: lo: Link UP Dec 16 12:45:36.067459 systemd-networkd[1521]: lo: Gained carrier Dec 16 12:45:36.071016 systemd-networkd[1521]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:45:36.071152 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 12:45:36.071551 systemd-networkd[1521]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:45:36.071649 systemd[1]: Reached target network.target - Network. Dec 16 12:45:36.073603 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 16 12:45:36.074340 systemd-networkd[1521]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:45:36.074396 systemd-networkd[1521]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:45:36.075637 systemd-networkd[1521]: eth0: Link UP Dec 16 12:45:36.075893 systemd-networkd[1521]: eth0: Gained carrier Dec 16 12:45:36.076247 systemd-networkd[1521]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:45:36.076971 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 16 12:45:36.081195 systemd-networkd[1521]: eth1: Link UP Dec 16 12:45:36.082479 systemd-networkd[1521]: eth1: Gained carrier Dec 16 12:45:36.082497 systemd-networkd[1521]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:45:36.114803 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 16 12:45:36.121605 systemd-networkd[1521]: eth0: DHCPv4 address 77.42.23.34/32, gateway 172.31.1.1 acquired from 172.31.1.1 Dec 16 12:45:36.122496 systemd-timesyncd[1532]: Network configuration changed, trying to establish connection. Dec 16 12:45:36.129176 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:45:36.145816 systemd-networkd[1521]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Dec 16 12:45:36.147316 systemd-timesyncd[1532]: Network configuration changed, trying to establish connection. Dec 16 12:45:36.359494 ldconfig[1516]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 16 12:45:36.364308 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 16 12:45:36.365951 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 16 12:45:36.383367 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 16 12:45:36.383669 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 12:45:36.383868 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 16 12:45:36.383989 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 16 12:45:36.384091 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Dec 16 12:45:36.384289 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 16 12:45:36.384456 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. 
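The two DHCPv4 leases above follow one fixed message shape (interface, address/prefix, optional gateway, DHCP server). A Python sketch, assuming only that shape, that pulls the lease details out of such journal lines; note that eth1's line carries no gateway:

import re

# Parse systemd-networkd DHCPv4 lease messages in the shape logged above;
# the gateway part is optional (the eth1 lease has none).
PATTERN = re.compile(
    r"(?P<iface>\S+): DHCPv4 address (?P<addr>\S+?),?"
    r"(?: gateway (?P<gw>\S+))? acquired from (?P<server>\S+)"
)

lines = [
    "eth0: DHCPv4 address 77.42.23.34/32, gateway 172.31.1.1 acquired from 172.31.1.1",
    "eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1",
]
for line in lines:
    m = PATTERN.search(line)
    print(m.group("iface"), m.group("addr"), m.group("gw"), m.group("server"))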
Dec 16 12:45:36.384646 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Dec 16 12:45:36.387769 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Dec 16 12:45:36.388712 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 16 12:45:36.389751 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 16 12:45:36.389867 systemd[1]: Reached target paths.target - Path Units. Dec 16 12:45:36.390902 systemd[1]: Reached target timers.target - Timer Units. Dec 16 12:45:36.394248 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 16 12:45:36.396701 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 16 12:45:36.402947 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 16 12:45:36.407642 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 16 12:45:36.409406 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 16 12:45:36.425293 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 16 12:45:36.427866 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 16 12:45:36.429702 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 16 12:45:36.431492 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 12:45:36.432461 systemd[1]: Reached target basic.target - Basic System. Dec 16 12:45:36.433467 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 16 12:45:36.433658 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 16 12:45:36.435605 systemd[1]: Starting containerd.service - containerd container runtime... Dec 16 12:45:36.438650 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 16 12:45:36.448796 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 16 12:45:36.453668 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 16 12:45:36.456805 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 16 12:45:36.466124 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 16 12:45:36.468386 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 16 12:45:36.470303 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Dec 16 12:45:36.474692 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 16 12:45:36.479701 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 16 12:45:36.482879 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Dec 16 12:45:36.490658 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 16 12:45:36.492321 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 16 12:45:36.496639 jq[1591]: false Dec 16 12:45:36.500981 systemd[1]: Starting systemd-logind.service - User Login Management... 
Dec 16 12:45:36.507741 extend-filesystems[1592]: Found /dev/sda6 Dec 16 12:45:36.503772 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 16 12:45:36.519448 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Refreshing passwd entry cache Dec 16 12:45:36.519448 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Failure getting users, quitting Dec 16 12:45:36.519448 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 16 12:45:36.519448 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Refreshing group entry cache Dec 16 12:45:36.510435 oslogin_cache_refresh[1593]: Refreshing passwd entry cache Dec 16 12:45:36.504195 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 16 12:45:36.512776 oslogin_cache_refresh[1593]: Failure getting users, quitting Dec 16 12:45:36.506488 systemd[1]: Starting update-engine.service - Update Engine... Dec 16 12:45:36.512796 oslogin_cache_refresh[1593]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 16 12:45:36.512369 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 16 12:45:36.512837 oslogin_cache_refresh[1593]: Refreshing group entry cache Dec 16 12:45:36.520257 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 16 12:45:36.523213 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Failure getting groups, quitting Dec 16 12:45:36.523213 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 16 12:45:36.523271 extend-filesystems[1592]: Found /dev/sda9 Dec 16 12:45:36.522493 oslogin_cache_refresh[1593]: Failure getting groups, quitting Dec 16 12:45:36.522737 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 16 12:45:36.534197 extend-filesystems[1592]: Checking size of /dev/sda9 Dec 16 12:45:36.538374 coreos-metadata[1586]: Dec 16 12:45:36.532 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Dec 16 12:45:36.538374 coreos-metadata[1586]: Dec 16 12:45:36.534 INFO Fetch successful Dec 16 12:45:36.538374 coreos-metadata[1586]: Dec 16 12:45:36.534 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Dec 16 12:45:36.538374 coreos-metadata[1586]: Dec 16 12:45:36.534 INFO Fetch successful Dec 16 12:45:36.522501 oslogin_cache_refresh[1593]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 16 12:45:36.524321 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 16 12:45:36.524880 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Dec 16 12:45:36.525281 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Dec 16 12:45:36.539864 systemd[1]: motdgen.service: Deactivated successfully. Dec 16 12:45:36.540683 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 16 12:45:36.548457 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 16 12:45:36.549786 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
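coreos-metadata reports two successful fetches above against the Hetzner metadata service. A minimal Python sketch of the same request, using only the URL from the log; the link-local address 169.254.169.254 only answers from inside the instance itself:

import urllib.request

# Repeat the metadata fetch that coreos-metadata logs above; this
# endpoint is reachable only from within a Hetzner instance.
URL = "http://169.254.169.254/hetzner/v1/metadata"

with urllib.request.urlopen(URL, timeout=5) as resp:
    print(resp.status, resp.read().decode()[:200])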
Dec 16 12:45:36.557701 jq[1611]: true Dec 16 12:45:36.562770 extend-filesystems[1592]: Resized partition /dev/sda9 Dec 16 12:45:36.569293 extend-filesystems[1630]: resize2fs 1.47.3 (8-Jul-2025) Dec 16 12:45:36.587886 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 8410107 blocks Dec 16 12:45:36.587922 update_engine[1609]: I20251216 12:45:36.565214 1609 main.cc:92] Flatcar Update Engine starting Dec 16 12:45:36.598817 tar[1618]: linux-amd64/LICENSE Dec 16 12:45:36.617691 jq[1627]: true Dec 16 12:45:36.617775 tar[1618]: linux-amd64/helm Dec 16 12:45:36.660280 dbus-daemon[1587]: [system] SELinux support is enabled Dec 16 12:45:36.660484 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 16 12:45:36.666501 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 16 12:45:36.666580 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 16 12:45:36.669237 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 16 12:45:36.669256 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 16 12:45:36.708884 update_engine[1609]: I20251216 12:45:36.706291 1609 update_check_scheduler.cc:74] Next update check in 2m17s Dec 16 12:45:36.706439 systemd[1]: Started update-engine.service - Update Engine. Dec 16 12:45:36.717500 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 16 12:45:36.734678 kernel: EXT4-fs (sda9): resized filesystem to 8410107 Dec 16 12:45:36.725474 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 12:45:36.726355 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 16 12:45:36.732135 systemd-logind[1607]: New seat seat0. Dec 16 12:45:36.736474 extend-filesystems[1630]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Dec 16 12:45:36.736474 extend-filesystems[1630]: old_desc_blocks = 1, new_desc_blocks = 5 Dec 16 12:45:36.736474 extend-filesystems[1630]: The filesystem on /dev/sda9 is now 8410107 (4k) blocks long. Dec 16 12:45:36.764302 extend-filesystems[1592]: Resized filesystem in /dev/sda9 Dec 16 12:45:36.737400 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 12:45:36.738604 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 16 12:45:36.772572 systemd-logind[1607]: Watching system buttons on /dev/input/event3 (Power Button) Dec 16 12:45:36.773023 systemd-logind[1607]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Dec 16 12:45:36.773296 systemd[1]: Started systemd-logind.service - User Login Management. Dec 16 12:45:36.791651 bash[1677]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:45:36.795472 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 16 12:45:36.806878 systemd[1]: Starting sshkeys.service... Dec 16 12:45:36.852335 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 16 12:45:36.859657 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
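[Note] extend-filesystems.service grows the root filesystem online: resize2fs 1.47.3 takes the mounted ext4 on /dev/sda9 from 1617920 to 8410107 4k blocks (roughly 32 GiB) without unmounting it. The manual equivalent, assuming the underlying partition has already been enlarged and using the device names from this log, would be:

    # check current size and usage of the root filesystem
    df -h /
    # grow the mounted ext4 filesystem to fill its partition (online resize)
    resize2fs /dev/sda9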
Dec 16 12:45:36.923557 coreos-metadata[1680]: Dec 16 12:45:36.921 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Dec 16 12:45:36.926949 coreos-metadata[1680]: Dec 16 12:45:36.926 INFO Fetch successful Dec 16 12:45:36.927873 unknown[1680]: wrote ssh authorized keys file for user: core Dec 16 12:45:36.931741 containerd[1624]: time="2025-12-16T12:45:36Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 16 12:45:36.932038 containerd[1624]: time="2025-12-16T12:45:36.932005732Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961197963Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.056µs" Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961236204Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961268926Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961278754Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961383741Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961397968Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961473790Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961484130Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961733808Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961759586Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961773963Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 12:45:36.962917 containerd[1624]: time="2025-12-16T12:45:36.961785735Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Dec 16 12:45:36.963149 containerd[1624]: time="2025-12-16T12:45:36.961930757Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Dec 16 12:45:36.963149 containerd[1624]: 
time="2025-12-16T12:45:36.961943180Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 16 12:45:36.963149 containerd[1624]: time="2025-12-16T12:45:36.962004856Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 16 12:45:36.963149 containerd[1624]: time="2025-12-16T12:45:36.962142975Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 12:45:36.963149 containerd[1624]: time="2025-12-16T12:45:36.962167070Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 12:45:36.963149 containerd[1624]: time="2025-12-16T12:45:36.962175085Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 16 12:45:36.963364 containerd[1624]: time="2025-12-16T12:45:36.963336774Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 16 12:45:36.964274 containerd[1624]: time="2025-12-16T12:45:36.964045944Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 16 12:45:36.964867 containerd[1624]: time="2025-12-16T12:45:36.964517579Z" level=info msg="metadata content store policy set" policy=shared Dec 16 12:45:36.969338 containerd[1624]: time="2025-12-16T12:45:36.969281443Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 12:45:36.969385 containerd[1624]: time="2025-12-16T12:45:36.969362165Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:45:36.969462 containerd[1624]: time="2025-12-16T12:45:36.969442756Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:45:36.969482 containerd[1624]: time="2025-12-16T12:45:36.969461621Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 12:45:36.970283 containerd[1624]: time="2025-12-16T12:45:36.969474235Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 12:45:36.970338 containerd[1624]: time="2025-12-16T12:45:36.970306035Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 12:45:36.970452 containerd[1624]: time="2025-12-16T12:45:36.970417975Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 12:45:36.972549 update-ssh-keys[1685]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:45:36.974655 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). 
Dec 16 12:45:36.977439 containerd[1624]: time="2025-12-16T12:45:36.977393187Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 12:45:36.977489 containerd[1624]: time="2025-12-16T12:45:36.977460654Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 12:45:36.977489 containerd[1624]: time="2025-12-16T12:45:36.977478958Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 12:45:36.977520 containerd[1624]: time="2025-12-16T12:45:36.977496310Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 12:45:36.977520 containerd[1624]: time="2025-12-16T12:45:36.977510116Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 12:45:36.978388 containerd[1624]: time="2025-12-16T12:45:36.977520977Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 12:45:36.978428 containerd[1624]: time="2025-12-16T12:45:36.978403302Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 12:45:36.979983 systemd[1]: Finished sshkeys.service. Dec 16 12:45:36.983949 containerd[1624]: time="2025-12-16T12:45:36.983913776Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 12:45:36.984015 containerd[1624]: time="2025-12-16T12:45:36.983955334Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 12:45:36.984015 containerd[1624]: time="2025-12-16T12:45:36.983979660Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 12:45:36.984015 containerd[1624]: time="2025-12-16T12:45:36.983993686Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 12:45:36.984015 containerd[1624]: time="2025-12-16T12:45:36.984006600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 12:45:36.984115 containerd[1624]: time="2025-12-16T12:45:36.984015798Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 12:45:36.984115 containerd[1624]: time="2025-12-16T12:45:36.984031968Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 12:45:36.984115 containerd[1624]: time="2025-12-16T12:45:36.984043570Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 16 12:45:36.984115 containerd[1624]: time="2025-12-16T12:45:36.984054721Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 12:45:36.984115 containerd[1624]: time="2025-12-16T12:45:36.984070060Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 12:45:36.984115 containerd[1624]: time="2025-12-16T12:45:36.984088985Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 12:45:36.984200 containerd[1624]: time="2025-12-16T12:45:36.984131154Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 12:45:36.984200 containerd[1624]: time="2025-12-16T12:45:36.984191247Z" level=info msg="Get image filesystem path 
\"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 12:45:36.984229 containerd[1624]: time="2025-12-16T12:45:36.984208510Z" level=info msg="Start snapshots syncer" Dec 16 12:45:36.984244 containerd[1624]: time="2025-12-16T12:45:36.984234177Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 12:45:36.989554 containerd[1624]: time="2025-12-16T12:45:36.984516457Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 12:45:36.989554 containerd[1624]: time="2025-12-16T12:45:36.988917000Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.988981321Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.989113488Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.989150218Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.989169053Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.989187227Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.989202495Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 12:45:36.989682 
containerd[1624]: time="2025-12-16T12:45:36.989219708Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.989236980Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.989255786Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.989271034Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.989314546Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.989336337Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:45:36.989682 containerd[1624]: time="2025-12-16T12:45:36.989351344Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:45:36.989863 containerd[1624]: time="2025-12-16T12:45:36.989365050Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:45:36.989863 containerd[1624]: time="2025-12-16T12:45:36.989379307Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 12:45:36.989863 containerd[1624]: time="2025-12-16T12:45:36.989395077Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 12:45:36.989863 containerd[1624]: time="2025-12-16T12:45:36.989409003Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 12:45:36.989863 containerd[1624]: time="2025-12-16T12:45:36.989431345Z" level=info msg="runtime interface created" Dec 16 12:45:36.989863 containerd[1624]: time="2025-12-16T12:45:36.989437937Z" level=info msg="created NRI interface" Dec 16 12:45:36.989863 containerd[1624]: time="2025-12-16T12:45:36.989453526Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 12:45:36.989863 containerd[1624]: time="2025-12-16T12:45:36.989468535Z" level=info msg="Connect containerd service" Dec 16 12:45:36.989863 containerd[1624]: time="2025-12-16T12:45:36.989501296Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 16 12:45:36.992479 containerd[1624]: time="2025-12-16T12:45:36.992449625Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 12:45:37.066758 locksmithd[1661]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 16 12:45:37.156735 containerd[1624]: time="2025-12-16T12:45:37.156100785Z" level=info msg="Start subscribing containerd event" Dec 16 12:45:37.156735 containerd[1624]: time="2025-12-16T12:45:37.156145299Z" level=info msg="Start recovering state" Dec 16 12:45:37.156735 containerd[1624]: time="2025-12-16T12:45:37.156228735Z" level=info msg="Start 
event monitor" Dec 16 12:45:37.156735 containerd[1624]: time="2025-12-16T12:45:37.156240557Z" level=info msg="Start cni network conf syncer for default" Dec 16 12:45:37.156735 containerd[1624]: time="2025-12-16T12:45:37.156245607Z" level=info msg="Start streaming server" Dec 16 12:45:37.156735 containerd[1624]: time="2025-12-16T12:45:37.156252830Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 12:45:37.156735 containerd[1624]: time="2025-12-16T12:45:37.156258320Z" level=info msg="runtime interface starting up..." Dec 16 12:45:37.156735 containerd[1624]: time="2025-12-16T12:45:37.156262599Z" level=info msg="starting plugins..." Dec 16 12:45:37.156735 containerd[1624]: time="2025-12-16T12:45:37.156272477Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 12:45:37.157662 containerd[1624]: time="2025-12-16T12:45:37.157515528Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 12:45:37.158050 containerd[1624]: time="2025-12-16T12:45:37.158030805Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 16 12:45:37.159407 containerd[1624]: time="2025-12-16T12:45:37.159109327Z" level=info msg="containerd successfully booted in 0.229283s" Dec 16 12:45:37.159286 systemd[1]: Started containerd.service - containerd container runtime. Dec 16 12:45:37.171359 tar[1618]: linux-amd64/README.md Dec 16 12:45:37.185502 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 16 12:45:37.275729 systemd-networkd[1521]: eth1: Gained IPv6LL Dec 16 12:45:37.276581 systemd-timesyncd[1532]: Network configuration changed, trying to establish connection. Dec 16 12:45:37.279945 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 12:45:37.281511 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 12:45:37.286685 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:45:37.289466 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 12:45:37.291107 sshd_keygen[1631]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 12:45:37.317900 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 16 12:45:37.320701 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 16 12:45:37.324827 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 16 12:45:37.337927 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 12:45:37.338184 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 16 12:45:37.341606 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 12:45:37.357671 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 16 12:45:37.360702 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 12:45:37.364461 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Dec 16 12:45:37.366244 systemd[1]: Reached target getty.target - Login Prompts. Dec 16 12:45:37.467759 systemd-networkd[1521]: eth0: Gained IPv6LL Dec 16 12:45:37.468417 systemd-timesyncd[1532]: Network configuration changed, trying to establish connection. Dec 16 12:45:38.187561 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:45:38.188477 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 12:45:38.191176 systemd[1]: Startup finished in 3.345s (kernel) + 7.008s (initrd) + 4.510s (userspace) = 14.864s. 
Dec 16 12:45:38.196804 (kubelet)[1743]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:45:38.688767 kubelet[1743]: E1216 12:45:38.688678 1743 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:45:38.691045 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:45:38.691178 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:45:38.691511 systemd[1]: kubelet.service: Consumed 846ms CPU time, 256.5M memory peak. Dec 16 12:45:40.080763 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 16 12:45:40.082279 systemd[1]: Started sshd@0-77.42.23.34:22-147.75.109.163:46166.service - OpenSSH per-connection server daemon (147.75.109.163:46166). Dec 16 12:45:41.092267 sshd[1755]: Accepted publickey for core from 147.75.109.163 port 46166 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:45:41.094199 sshd-session[1755]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:45:41.101747 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 12:45:41.103501 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 12:45:41.111618 systemd-logind[1607]: New session 1 of user core. Dec 16 12:45:41.123883 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 12:45:41.127234 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 12:45:41.143806 (systemd)[1761]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:45:41.146989 systemd-logind[1607]: New session 2 of user core. Dec 16 12:45:41.285701 systemd[1761]: Queued start job for default target default.target. Dec 16 12:45:41.293130 systemd[1761]: Created slice app.slice - User Application Slice. Dec 16 12:45:41.293174 systemd[1761]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Dec 16 12:45:41.293192 systemd[1761]: Reached target paths.target - Paths. Dec 16 12:45:41.293254 systemd[1761]: Reached target timers.target - Timers. Dec 16 12:45:41.294863 systemd[1761]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 12:45:41.297693 systemd[1761]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Dec 16 12:45:41.308125 systemd[1761]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 12:45:41.308431 systemd[1761]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Dec 16 12:45:41.308736 systemd[1761]: Reached target sockets.target - Sockets. Dec 16 12:45:41.308790 systemd[1761]: Reached target basic.target - Basic System. Dec 16 12:45:41.308822 systemd[1761]: Reached target default.target - Main User Target. Dec 16 12:45:41.308845 systemd[1761]: Startup finished in 156ms. Dec 16 12:45:41.309039 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 12:45:41.315886 systemd[1]: Started session-1.scope - Session 1 of User core. 
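[Note] The kubelet exits immediately because /var/lib/kubelet/config.yaml does not exist yet. On a kubeadm-style node that file is only written by kubeadm init or kubeadm join, so a kubelet.service that fails and is restarted (the restart appears further down with "restart counter is at 1") is the expected state before the node is initialized, not a fault in itself. A quick check, using standard tooling and the unit and path names from the log:

    systemctl status kubelet.service
    journalctl -u kubelet.service --no-pager | tail -n 20
    ls -l /var/lib/kubelet/config.yaml   # absent until kubeadm init/join writes it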
Dec 16 12:45:41.858159 systemd[1]: Started sshd@1-77.42.23.34:22-147.75.109.163:46178.service - OpenSSH per-connection server daemon (147.75.109.163:46178). Dec 16 12:45:42.742404 sshd[1775]: Accepted publickey for core from 147.75.109.163 port 46178 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:45:42.743762 sshd-session[1775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:45:42.749214 systemd-logind[1607]: New session 3 of user core. Dec 16 12:45:42.754707 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 12:45:43.240885 sshd[1779]: Connection closed by 147.75.109.163 port 46178 Dec 16 12:45:43.241599 sshd-session[1775]: pam_unix(sshd:session): session closed for user core Dec 16 12:45:43.244465 systemd[1]: sshd@1-77.42.23.34:22-147.75.109.163:46178.service: Deactivated successfully. Dec 16 12:45:43.246412 systemd-logind[1607]: Session 3 logged out. Waiting for processes to exit. Dec 16 12:45:43.246767 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 12:45:43.247970 systemd-logind[1607]: Removed session 3. Dec 16 12:45:43.457432 systemd[1]: Started sshd@2-77.42.23.34:22-147.75.109.163:47752.service - OpenSSH per-connection server daemon (147.75.109.163:47752). Dec 16 12:45:44.423412 sshd[1785]: Accepted publickey for core from 147.75.109.163 port 47752 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:45:44.424762 sshd-session[1785]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:45:44.430140 systemd-logind[1607]: New session 4 of user core. Dec 16 12:45:44.435694 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 16 12:45:44.977932 sshd[1789]: Connection closed by 147.75.109.163 port 47752 Dec 16 12:45:44.978444 sshd-session[1785]: pam_unix(sshd:session): session closed for user core Dec 16 12:45:44.981748 systemd-logind[1607]: Session 4 logged out. Waiting for processes to exit. Dec 16 12:45:44.982412 systemd[1]: sshd@2-77.42.23.34:22-147.75.109.163:47752.service: Deactivated successfully. Dec 16 12:45:44.983876 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 12:45:44.985052 systemd-logind[1607]: Removed session 4. Dec 16 12:45:45.140067 systemd[1]: Started sshd@3-77.42.23.34:22-147.75.109.163:47764.service - OpenSSH per-connection server daemon (147.75.109.163:47764). Dec 16 12:45:46.018050 sshd[1795]: Accepted publickey for core from 147.75.109.163 port 47764 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:45:46.019436 sshd-session[1795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:45:46.024733 systemd-logind[1607]: New session 5 of user core. Dec 16 12:45:46.031698 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 16 12:45:46.525240 sshd[1799]: Connection closed by 147.75.109.163 port 47764 Dec 16 12:45:46.525761 sshd-session[1795]: pam_unix(sshd:session): session closed for user core Dec 16 12:45:46.529067 systemd-logind[1607]: Session 5 logged out. Waiting for processes to exit. Dec 16 12:45:46.529741 systemd[1]: sshd@3-77.42.23.34:22-147.75.109.163:47764.service: Deactivated successfully. Dec 16 12:45:46.531455 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 12:45:46.533300 systemd-logind[1607]: Removed session 5. Dec 16 12:45:46.700908 systemd[1]: Started sshd@4-77.42.23.34:22-147.75.109.163:47776.service - OpenSSH per-connection server daemon (147.75.109.163:47776). 
Dec 16 12:45:47.572887 sshd[1805]: Accepted publickey for core from 147.75.109.163 port 47776 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:45:47.574278 sshd-session[1805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:45:47.579925 systemd-logind[1607]: New session 6 of user core. Dec 16 12:45:47.589748 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 16 12:45:47.923330 sudo[1810]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 12:45:47.923740 sudo[1810]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:45:47.935673 sudo[1810]: pam_unix(sudo:session): session closed for user root Dec 16 12:45:48.101719 sshd[1809]: Connection closed by 147.75.109.163 port 47776 Dec 16 12:45:48.102558 sshd-session[1805]: pam_unix(sshd:session): session closed for user core Dec 16 12:45:48.107229 systemd[1]: sshd@4-77.42.23.34:22-147.75.109.163:47776.service: Deactivated successfully. Dec 16 12:45:48.109080 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 12:45:48.109937 systemd-logind[1607]: Session 6 logged out. Waiting for processes to exit. Dec 16 12:45:48.111148 systemd-logind[1607]: Removed session 6. Dec 16 12:45:48.316221 systemd[1]: Started sshd@5-77.42.23.34:22-147.75.109.163:47778.service - OpenSSH per-connection server daemon (147.75.109.163:47778). Dec 16 12:45:48.942028 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 12:45:48.944044 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:45:49.083101 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:45:49.088965 (kubelet)[1828]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:45:49.133616 kubelet[1828]: E1216 12:45:49.133520 1828 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:45:49.137342 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:45:49.137483 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:45:49.137862 systemd[1]: kubelet.service: Consumed 136ms CPU time, 110.3M memory peak. Dec 16 12:45:49.279639 sshd[1817]: Accepted publickey for core from 147.75.109.163 port 47778 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:45:49.281295 sshd-session[1817]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:45:49.286966 systemd-logind[1607]: New session 7 of user core. Dec 16 12:45:49.292701 systemd[1]: Started session-7.scope - Session 7 of User core. 
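[Note] From this point the journal interleaves kernel audit records with unit output. Each audit PROCTITLE field is the executed command line, hex-encoded with NUL bytes separating the arguments, so the NETFILTER_CFG records below are dockerd (ppid 1895) running /usr/bin/iptables and ip6tables to create its DOCKER chains. Any proctitle value can be decoded with standard tools; the example uses the auditctl record that follows:

    echo 2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 \
      | xxd -r -p | tr '\0' ' '; echo
    # -> /sbin/auditctl -R /etc/audit/audit.rules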
Dec 16 12:45:49.653196 sudo[1838]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 12:45:49.653573 sudo[1838]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:45:49.655908 sudo[1838]: pam_unix(sudo:session): session closed for user root Dec 16 12:45:49.661870 sudo[1837]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 12:45:49.662197 sudo[1837]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:45:49.669695 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:45:49.708579 kernel: kauditd_printk_skb: 134 callbacks suppressed Dec 16 12:45:49.708698 kernel: audit: type=1305 audit(1765889149.706:232): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:45:49.706000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:45:49.708781 augenrules[1862]: No rules Dec 16 12:45:49.706000 audit[1862]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffcd9f98050 a2=420 a3=0 items=0 ppid=1843 pid=1862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:49.711854 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:45:49.712351 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 12:45:49.714167 sudo[1837]: pam_unix(sudo:session): session closed for user root Dec 16 12:45:49.718362 kernel: audit: type=1300 audit(1765889149.706:232): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffcd9f98050 a2=420 a3=0 items=0 ppid=1843 pid=1862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:49.718446 kernel: audit: type=1327 audit(1765889149.706:232): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:45:49.706000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:45:49.711000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:49.727224 kernel: audit: type=1130 audit(1765889149.711:233): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:49.727315 kernel: audit: type=1131 audit(1765889149.711:234): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:49.711000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:49.711000 audit[1837]: USER_END pid=1837 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:45:49.711000 audit[1837]: CRED_DISP pid=1837 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:45:49.741920 kernel: audit: type=1106 audit(1765889149.711:235): pid=1837 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:45:49.742009 kernel: audit: type=1104 audit(1765889149.711:236): pid=1837 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:45:49.899716 sshd[1836]: Connection closed by 147.75.109.163 port 47778 Dec 16 12:45:49.900278 sshd-session[1817]: pam_unix(sshd:session): session closed for user core Dec 16 12:45:49.900000 audit[1817]: USER_END pid=1817 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:45:49.904394 systemd[1]: sshd@5-77.42.23.34:22-147.75.109.163:47778.service: Deactivated successfully. Dec 16 12:45:49.906947 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 12:45:49.901000 audit[1817]: CRED_DISP pid=1817 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:45:49.908490 systemd-logind[1607]: Session 7 logged out. Waiting for processes to exit. Dec 16 12:45:49.911162 systemd-logind[1607]: Removed session 7. Dec 16 12:45:49.914799 kernel: audit: type=1106 audit(1765889149.900:237): pid=1817 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:45:49.914857 kernel: audit: type=1104 audit(1765889149.901:238): pid=1817 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:45:49.901000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-77.42.23.34:22-147.75.109.163:47778 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:49.920822 kernel: audit: type=1131 audit(1765889149.901:239): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-77.42.23.34:22-147.75.109.163:47778 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:50.060968 systemd[1]: Started sshd@6-77.42.23.34:22-147.75.109.163:47784.service - OpenSSH per-connection server daemon (147.75.109.163:47784). Dec 16 12:45:50.060000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-77.42.23.34:22-147.75.109.163:47784 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:50.946000 audit[1871]: USER_ACCT pid=1871 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:45:50.946925 sshd[1871]: Accepted publickey for core from 147.75.109.163 port 47784 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:45:50.947000 audit[1871]: CRED_ACQ pid=1871 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:45:50.947000 audit[1871]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffedaf699a0 a2=3 a3=0 items=0 ppid=1 pid=1871 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:50.947000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:45:50.948604 sshd-session[1871]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:45:50.955340 systemd-logind[1607]: New session 8 of user core. Dec 16 12:45:50.965828 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 16 12:45:50.969000 audit[1871]: USER_START pid=1871 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:45:50.971000 audit[1875]: CRED_ACQ pid=1875 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:45:51.288000 audit[1876]: USER_ACCT pid=1876 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:45:51.289069 sudo[1876]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 12:45:51.288000 audit[1876]: CRED_REFR pid=1876 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:45:51.288000 audit[1876]: USER_START pid=1876 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 12:45:51.289335 sudo[1876]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:45:51.640028 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 16 12:45:51.661814 (dockerd)[1895]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 12:45:52.021316 dockerd[1895]: time="2025-12-16T12:45:52.020564327Z" level=info msg="Starting up" Dec 16 12:45:52.024609 dockerd[1895]: time="2025-12-16T12:45:52.024560632Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 12:45:52.039222 dockerd[1895]: time="2025-12-16T12:45:52.039147530Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 12:45:52.086344 dockerd[1895]: time="2025-12-16T12:45:52.086285388Z" level=info msg="Loading containers: start." Dec 16 12:45:52.097565 kernel: Initializing XFRM netlink socket Dec 16 12:45:52.149000 audit[1945]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1945 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.149000 audit[1945]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffea611bb40 a2=0 a3=0 items=0 ppid=1895 pid=1945 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.149000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:45:52.150000 audit[1947]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1947 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.150000 audit[1947]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffc4bee0dc0 a2=0 a3=0 items=0 ppid=1895 pid=1947 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.150000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:45:52.152000 audit[1949]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1949 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.152000 audit[1949]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe865094f0 a2=0 a3=0 items=0 ppid=1895 pid=1949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.152000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:45:52.154000 audit[1951]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1951 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.154000 audit[1951]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe269a25c0 a2=0 a3=0 items=0 ppid=1895 pid=1951 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.154000 audit: 
PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:45:52.156000 audit[1953]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=1953 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.156000 audit[1953]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffde9795110 a2=0 a3=0 items=0 ppid=1895 pid=1953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.156000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:45:52.157000 audit[1955]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=1955 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.157000 audit[1955]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffcb103c360 a2=0 a3=0 items=0 ppid=1895 pid=1955 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.157000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:45:52.159000 audit[1957]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1957 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.159000 audit[1957]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff5b4d09d0 a2=0 a3=0 items=0 ppid=1895 pid=1957 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.159000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:45:52.160000 audit[1959]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=1959 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.160000 audit[1959]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffd560d4d30 a2=0 a3=0 items=0 ppid=1895 pid=1959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.160000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:45:52.185000 audit[1962]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=1962 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.185000 audit[1962]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7ffe2f775520 a2=0 a3=0 items=0 ppid=1895 pid=1962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.185000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Dec 16 12:45:52.187000 audit[1964]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=1964 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.187000 audit[1964]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffd51380fd0 a2=0 a3=0 items=0 ppid=1895 pid=1964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.187000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:45:52.191000 audit[1966]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1966 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.191000 audit[1966]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffdf9013aa0 a2=0 a3=0 items=0 ppid=1895 pid=1966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.191000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:45:52.193000 audit[1968]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=1968 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.193000 audit[1968]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffdf7ed9c80 a2=0 a3=0 items=0 ppid=1895 pid=1968 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.193000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:45:52.195000 audit[1970]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=1970 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.195000 audit[1970]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7fff80d20630 a2=0 a3=0 items=0 ppid=1895 pid=1970 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.195000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:45:52.228000 audit[2000]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2000 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.228000 audit[2000]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffd60207640 a2=0 a3=0 items=0 ppid=1895 pid=2000 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.228000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:45:52.230000 audit[2002]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2002 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.230000 audit[2002]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffe4f5de8d0 a2=0 a3=0 items=0 ppid=1895 pid=2002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.230000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:45:52.231000 audit[2004]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2004 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.231000 audit[2004]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc853ba240 a2=0 a3=0 items=0 ppid=1895 pid=2004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.231000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:45:52.233000 audit[2006]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2006 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.233000 audit[2006]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffee40a4510 a2=0 a3=0 items=0 ppid=1895 pid=2006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.233000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:45:52.235000 audit[2008]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2008 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.235000 audit[2008]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd03702600 a2=0 a3=0 items=0 ppid=1895 pid=2008 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.235000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:45:52.237000 audit[2010]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2010 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.237000 audit[2010]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff83aefe80 a2=0 a3=0 items=0 ppid=1895 pid=2010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.237000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:45:52.238000 audit[2012]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2012 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.238000 audit[2012]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff39db4270 a2=0 a3=0 items=0 ppid=1895 pid=2012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.238000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:45:52.240000 audit[2014]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2014 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.240000 audit[2014]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffe059fd280 a2=0 a3=0 items=0 ppid=1895 pid=2014 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.240000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:45:52.242000 audit[2016]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2016 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.242000 audit[2016]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffe39f8d0c0 a2=0 a3=0 items=0 ppid=1895 pid=2016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.242000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Dec 16 12:45:52.244000 audit[2018]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2018 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.244000 audit[2018]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffec0235570 a2=0 a3=0 items=0 ppid=1895 pid=2018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.244000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:45:52.245000 audit[2020]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2020 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.245000 audit[2020]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffeb1e3a5f0 a2=0 a3=0 items=0 ppid=1895 pid=2020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.245000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:45:52.247000 audit[2022]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2022 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Dec 16 12:45:52.247000 audit[2022]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffeeddb1ea0 a2=0 a3=0 items=0 ppid=1895 pid=2022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.247000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:45:52.249000 audit[2024]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2024 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.249000 audit[2024]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffd5b3a6450 a2=0 a3=0 items=0 ppid=1895 pid=2024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.249000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:45:52.253000 audit[2029]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2029 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.253000 audit[2029]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff4822c0c0 a2=0 a3=0 items=0 ppid=1895 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.253000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:45:52.255000 audit[2031]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2031 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.255000 audit[2031]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7fffa4b8b530 a2=0 a3=0 items=0 ppid=1895 pid=2031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.255000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:45:52.257000 audit[2033]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2033 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.257000 audit[2033]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffcba7cf480 a2=0 a3=0 items=0 ppid=1895 pid=2033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.257000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:45:52.258000 audit[2035]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2035 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.258000 audit[2035]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd28bf5d40 a2=0 a3=0 items=0 ppid=1895 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.258000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:45:52.260000 audit[2037]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2037 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.260000 audit[2037]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffe427f78b0 a2=0 a3=0 items=0 ppid=1895 pid=2037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.260000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:45:52.262000 audit[2039]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2039 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:45:52.262000 audit[2039]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7fffbafcc490 a2=0 a3=0 items=0 ppid=1895 pid=2039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.262000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:45:52.272316 systemd-timesyncd[1532]: Network configuration changed, trying to establish connection. Dec 16 12:45:52.287000 audit[2043]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2043 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.287000 audit[2043]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffde31434e0 a2=0 a3=0 items=0 ppid=1895 pid=2043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.287000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Dec 16 12:45:52.289000 audit[2045]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2045 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.289000 audit[2045]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffe21560630 a2=0 a3=0 items=0 ppid=1895 pid=2045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.289000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Dec 16 12:45:52.296000 audit[2053]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.296000 audit[2053]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7fff2f6c1960 a2=0 a3=0 items=0 ppid=1895 pid=2053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.296000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Dec 16 12:45:52.304000 audit[2059]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2059 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.304000 audit[2059]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fff38104b60 a2=0 a3=0 items=0 ppid=1895 pid=2059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.304000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Dec 16 12:45:52.305739 systemd-timesyncd[1532]: Contacted time server 5.9.19.62:123 (2.flatcar.pool.ntp.org). Dec 16 12:45:52.305897 systemd-timesyncd[1532]: Initial clock synchronization to Tue 2025-12-16 12:45:52.367878 UTC. Dec 16 12:45:52.306000 audit[2061]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2061 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.306000 audit[2061]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7ffce97a1750 a2=0 a3=0 items=0 ppid=1895 pid=2061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.306000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Dec 16 12:45:52.308000 audit[2063]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2063 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.308000 audit[2063]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffd896e9120 a2=0 a3=0 items=0 ppid=1895 pid=2063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.308000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Dec 16 12:45:52.309000 audit[2065]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2065 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.309000 audit[2065]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffe78efb880 a2=0 a3=0 items=0 ppid=1895 pid=2065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.309000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:45:52.311000 audit[2067]: 
NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2067 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:45:52.311000 audit[2067]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe14f856a0 a2=0 a3=0 items=0 ppid=1895 pid=2067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:45:52.311000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Dec 16 12:45:52.312711 systemd-networkd[1521]: docker0: Link UP Dec 16 12:45:52.316693 dockerd[1895]: time="2025-12-16T12:45:52.316652303Z" level=info msg="Loading containers: done." Dec 16 12:45:52.329131 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck344312186-merged.mount: Deactivated successfully. Dec 16 12:45:52.335040 dockerd[1895]: time="2025-12-16T12:45:52.334991278Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 12:45:52.335165 dockerd[1895]: time="2025-12-16T12:45:52.335065818Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 12:45:52.335165 dockerd[1895]: time="2025-12-16T12:45:52.335125940Z" level=info msg="Initializing buildkit" Dec 16 12:45:52.355116 dockerd[1895]: time="2025-12-16T12:45:52.355076338Z" level=info msg="Completed buildkit initialization" Dec 16 12:45:52.363395 dockerd[1895]: time="2025-12-16T12:45:52.363357809Z" level=info msg="Daemon has completed initialization" Dec 16 12:45:52.363612 dockerd[1895]: time="2025-12-16T12:45:52.363561431Z" level=info msg="API listen on /run/docker.sock" Dec 16 12:45:52.363780 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 16 12:45:52.363000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:53.307158 containerd[1624]: time="2025-12-16T12:45:53.307088844Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Dec 16 12:45:54.073444 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4263464699.mount: Deactivated successfully. 
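
The audit records above capture every iptables/ip6tables invocation dockerd makes while wiring up its DOCKER, DOCKER-FORWARD, DOCKER-USER and isolation chains; the command line itself is hex-encoded in each PROCTITLE field as a NUL-separated argv. A minimal decoding sketch in Python (decode_proctitle is just an illustrative helper; the hex value is copied verbatim from one of the records above):

# Decode an auditd PROCTITLE value: it is the process argv, hex-encoded,
# with NUL bytes separating the arguments.
def decode_proctitle(hex_value: str) -> str:
    raw = bytes.fromhex(hex_value)
    return " ".join(a.decode("utf-8", errors="replace")
                    for a in raw.split(b"\x00") if a)

# One of the PROCTITLE values logged above:
print(decode_proctitle(
    "2F7573722F62696E2F69707461626C6573002D2D77616974"
    "002D4900464F5257415244002D6A00444F434B45522D464F5257415244"
))
# -> /usr/bin/iptables --wait -I FORWARD -j DOCKER-FORWARD
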
Dec 16 12:45:54.824670 containerd[1624]: time="2025-12-16T12:45:54.824606364Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:54.826139 containerd[1624]: time="2025-12-16T12:45:54.825807047Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=25399329" Dec 16 12:45:54.827033 containerd[1624]: time="2025-12-16T12:45:54.827000467Z" level=info msg="ImageCreate event name:\"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:54.830065 containerd[1624]: time="2025-12-16T12:45:54.830035085Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:54.831356 containerd[1624]: time="2025-12-16T12:45:54.831302918Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"27064672\" in 1.524147719s" Dec 16 12:45:54.831472 containerd[1624]: time="2025-12-16T12:45:54.831453680Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\"" Dec 16 12:45:54.832109 containerd[1624]: time="2025-12-16T12:45:54.832068997Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Dec 16 12:45:56.067348 containerd[1624]: time="2025-12-16T12:45:56.067278099Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:56.068652 containerd[1624]: time="2025-12-16T12:45:56.068632574Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=21154285" Dec 16 12:45:56.070631 containerd[1624]: time="2025-12-16T12:45:56.070594913Z" level=info msg="ImageCreate event name:\"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:56.073549 containerd[1624]: time="2025-12-16T12:45:56.073510162Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:56.074533 containerd[1624]: time="2025-12-16T12:45:56.074450000Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"22819474\" in 1.242333372s" Dec 16 12:45:56.074533 containerd[1624]: time="2025-12-16T12:45:56.074489750Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\"" Dec 16 12:45:56.076864 
containerd[1624]: time="2025-12-16T12:45:56.076829876Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Dec 16 12:45:57.028905 containerd[1624]: time="2025-12-16T12:45:57.028856739Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:57.029871 containerd[1624]: time="2025-12-16T12:45:57.029671779Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=15717792" Dec 16 12:45:57.030717 containerd[1624]: time="2025-12-16T12:45:57.030693092Z" level=info msg="ImageCreate event name:\"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:57.032785 containerd[1624]: time="2025-12-16T12:45:57.032765572Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:57.033275 containerd[1624]: time="2025-12-16T12:45:57.033248580Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"17382979\" in 956.386673ms" Dec 16 12:45:57.033314 containerd[1624]: time="2025-12-16T12:45:57.033276560Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\"" Dec 16 12:45:57.033632 containerd[1624]: time="2025-12-16T12:45:57.033610523Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Dec 16 12:45:57.963485 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2309300273.mount: Deactivated successfully. 
Dec 16 12:45:58.186978 containerd[1624]: time="2025-12-16T12:45:58.186914649Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:58.187968 containerd[1624]: time="2025-12-16T12:45:58.187865363Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=0" Dec 16 12:45:58.188780 containerd[1624]: time="2025-12-16T12:45:58.188754706Z" level=info msg="ImageCreate event name:\"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:58.190499 containerd[1624]: time="2025-12-16T12:45:58.190452653Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:58.190940 containerd[1624]: time="2025-12-16T12:45:58.190920821Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"25964312\" in 1.157284467s" Dec 16 12:45:58.191018 containerd[1624]: time="2025-12-16T12:45:58.191005477Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\"" Dec 16 12:45:58.191491 containerd[1624]: time="2025-12-16T12:45:58.191468627Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Dec 16 12:45:58.675204 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2117162848.mount: Deactivated successfully. 
Dec 16 12:45:59.309062 containerd[1624]: time="2025-12-16T12:45:59.308985631Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:59.310228 containerd[1624]: time="2025-12-16T12:45:59.310190293Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=21568511" Dec 16 12:45:59.310639 containerd[1624]: time="2025-12-16T12:45:59.310598742Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:59.312702 containerd[1624]: time="2025-12-16T12:45:59.312662475Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:59.313552 containerd[1624]: time="2025-12-16T12:45:59.313388444Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 1.121808156s" Dec 16 12:45:59.313552 containerd[1624]: time="2025-12-16T12:45:59.313412552Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\"" Dec 16 12:45:59.314057 containerd[1624]: time="2025-12-16T12:45:59.314034574Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Dec 16 12:45:59.387989 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 12:45:59.389929 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:45:59.495715 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:45:59.499992 kernel: kauditd_printk_skb: 132 callbacks suppressed Dec 16 12:45:59.500095 kernel: audit: type=1130 audit(1765889159.494:290): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:59.494000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:45:59.503983 (kubelet)[2241]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:45:59.542220 kubelet[2241]: E1216 12:45:59.542159 2241 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:45:59.544379 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:45:59.544519 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:45:59.544902 systemd[1]: kubelet.service: Consumed 125ms CPU time, 110.1M memory peak. 
Dec 16 12:45:59.544000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:45:59.551600 kernel: audit: type=1131 audit(1765889159.544:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:45:59.788993 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2475506598.mount: Deactivated successfully. Dec 16 12:45:59.794856 containerd[1624]: time="2025-12-16T12:45:59.794817475Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:59.795865 containerd[1624]: time="2025-12-16T12:45:59.795698121Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=0" Dec 16 12:45:59.796767 containerd[1624]: time="2025-12-16T12:45:59.796715139Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:59.799578 containerd[1624]: time="2025-12-16T12:45:59.798881965Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:45:59.799578 containerd[1624]: time="2025-12-16T12:45:59.799420861Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 485.361675ms" Dec 16 12:45:59.799578 containerd[1624]: time="2025-12-16T12:45:59.799445493Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Dec 16 12:45:59.800081 containerd[1624]: time="2025-12-16T12:45:59.800047086Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Dec 16 12:46:00.299170 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount777889211.mount: Deactivated successfully. 
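
Each containerd "Pulled image" entry above reports the pull wall-clock time as a Go-style duration ("in 1.524147719s", "in 485.361675ms", and so on). A rough sketch for extracting image names and pull times from the raw journal text; the regular expression and function name are illustrative assumptions, and it only handles the plain ms/s durations seen in these lines:

import re

# Matches: Pulled image \"<name>\" ... in <value><ms|s>"  (quotes are escaped
# inside the containerd msg="..." field, exactly as in the lines above).
PULLED = re.compile(r'Pulled image \\"([^"\\]+)\\".*? in ([0-9.]+)(ms|s)"')

def pull_durations(journal_text):
    """Yield (image, seconds) for every containerd 'Pulled image' line."""
    for image, value, unit in PULLED.findall(journal_text):
        seconds = float(value) / 1000.0 if unit == "ms" else float(value)
        yield image, seconds

# Example against a line shaped like the pause image pull above:
sample = (r'msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id '
          r'\"sha256:cd073f4c\", repo tag \"registry.k8s.io/pause:3.10.1\", '
          r'repo digest \"registry.k8s.io/pause@sha256:278fb9db\", '
          r'size \"320448\" in 485.361675ms"')
print(list(pull_durations(sample)))  # pause image with a pull time of ~0.485 s
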
Dec 16 12:46:06.466605 containerd[1624]: time="2025-12-16T12:46:06.466550773Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:06.467555 containerd[1624]: time="2025-12-16T12:46:06.467474811Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=72348001" Dec 16 12:46:06.468544 containerd[1624]: time="2025-12-16T12:46:06.468487829Z" level=info msg="ImageCreate event name:\"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:06.470827 containerd[1624]: time="2025-12-16T12:46:06.470551717Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:06.471607 containerd[1624]: time="2025-12-16T12:46:06.471578108Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"74311308\" in 6.671500998s" Dec 16 12:46:06.471607 containerd[1624]: time="2025-12-16T12:46:06.471603852Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\"" Dec 16 12:46:09.210194 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:46:09.210353 systemd[1]: kubelet.service: Consumed 125ms CPU time, 110.1M memory peak. Dec 16 12:46:09.208000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:46:09.214132 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:46:09.218710 kernel: audit: type=1130 audit(1765889169.208:292): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:46:09.208000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:46:09.226891 kernel: audit: type=1131 audit(1765889169.208:293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:46:09.247629 systemd[1]: Reload requested from client PID 2333 ('systemctl') (unit session-8.scope)... Dec 16 12:46:09.247658 systemd[1]: Reloading... Dec 16 12:46:09.378599 zram_generator::config[2386]: No configuration found. Dec 16 12:46:09.552874 systemd[1]: Reloading finished in 304 ms. 
Dec 16 12:46:09.577445 kernel: audit: type=1334 audit(1765889169.568:294): prog-id=63 op=LOAD Dec 16 12:46:09.577576 kernel: audit: type=1334 audit(1765889169.568:295): prog-id=52 op=UNLOAD Dec 16 12:46:09.568000 audit: BPF prog-id=63 op=LOAD Dec 16 12:46:09.568000 audit: BPF prog-id=52 op=UNLOAD Dec 16 12:46:09.581580 kernel: audit: type=1334 audit(1765889169.568:296): prog-id=64 op=LOAD Dec 16 12:46:09.568000 audit: BPF prog-id=64 op=LOAD Dec 16 12:46:09.568000 audit: BPF prog-id=65 op=LOAD Dec 16 12:46:09.585980 kernel: audit: type=1334 audit(1765889169.568:297): prog-id=65 op=LOAD Dec 16 12:46:09.586056 kernel: audit: type=1334 audit(1765889169.568:298): prog-id=53 op=UNLOAD Dec 16 12:46:09.568000 audit: BPF prog-id=53 op=UNLOAD Dec 16 12:46:09.568000 audit: BPF prog-id=54 op=UNLOAD Dec 16 12:46:09.569000 audit: BPF prog-id=66 op=LOAD Dec 16 12:46:09.589124 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 16 12:46:09.589184 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 16 12:46:09.589384 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:46:09.589430 systemd[1]: kubelet.service: Consumed 79ms CPU time, 97.8M memory peak. Dec 16 12:46:09.590812 kernel: audit: type=1334 audit(1765889169.568:299): prog-id=54 op=UNLOAD Dec 16 12:46:09.590863 kernel: audit: type=1334 audit(1765889169.569:300): prog-id=66 op=LOAD Dec 16 12:46:09.569000 audit: BPF prog-id=58 op=UNLOAD Dec 16 12:46:09.591806 kernel: audit: type=1334 audit(1765889169.569:301): prog-id=58 op=UNLOAD Dec 16 12:46:09.591657 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:46:09.571000 audit: BPF prog-id=67 op=LOAD Dec 16 12:46:09.571000 audit: BPF prog-id=49 op=UNLOAD Dec 16 12:46:09.571000 audit: BPF prog-id=68 op=LOAD Dec 16 12:46:09.571000 audit: BPF prog-id=43 op=UNLOAD Dec 16 12:46:09.571000 audit: BPF prog-id=69 op=LOAD Dec 16 12:46:09.571000 audit: BPF prog-id=70 op=LOAD Dec 16 12:46:09.571000 audit: BPF prog-id=44 op=UNLOAD Dec 16 12:46:09.571000 audit: BPF prog-id=45 op=UNLOAD Dec 16 12:46:09.573000 audit: BPF prog-id=71 op=LOAD Dec 16 12:46:09.573000 audit: BPF prog-id=55 op=UNLOAD Dec 16 12:46:09.573000 audit: BPF prog-id=72 op=LOAD Dec 16 12:46:09.573000 audit: BPF prog-id=73 op=LOAD Dec 16 12:46:09.573000 audit: BPF prog-id=56 op=UNLOAD Dec 16 12:46:09.573000 audit: BPF prog-id=57 op=UNLOAD Dec 16 12:46:09.573000 audit: BPF prog-id=74 op=LOAD Dec 16 12:46:09.573000 audit: BPF prog-id=75 op=LOAD Dec 16 12:46:09.573000 audit: BPF prog-id=50 op=UNLOAD Dec 16 12:46:09.573000 audit: BPF prog-id=51 op=UNLOAD Dec 16 12:46:09.575000 audit: BPF prog-id=76 op=LOAD Dec 16 12:46:09.575000 audit: BPF prog-id=46 op=UNLOAD Dec 16 12:46:09.575000 audit: BPF prog-id=77 op=LOAD Dec 16 12:46:09.575000 audit: BPF prog-id=78 op=LOAD Dec 16 12:46:09.575000 audit: BPF prog-id=47 op=UNLOAD Dec 16 12:46:09.575000 audit: BPF prog-id=48 op=UNLOAD Dec 16 12:46:09.577000 audit: BPF prog-id=79 op=LOAD Dec 16 12:46:09.577000 audit: BPF prog-id=60 op=UNLOAD Dec 16 12:46:09.577000 audit: BPF prog-id=80 op=LOAD Dec 16 12:46:09.577000 audit: BPF prog-id=81 op=LOAD Dec 16 12:46:09.577000 audit: BPF prog-id=61 op=UNLOAD Dec 16 12:46:09.577000 audit: BPF prog-id=62 op=UNLOAD Dec 16 12:46:09.577000 audit: BPF prog-id=82 op=LOAD Dec 16 12:46:09.577000 audit: BPF prog-id=59 op=UNLOAD Dec 16 12:46:09.588000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" 
exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:46:09.737869 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:46:09.738000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:46:09.743763 (kubelet)[2432]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:46:09.800518 kubelet[2432]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 12:46:09.800518 kubelet[2432]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:46:09.802031 kubelet[2432]: I1216 12:46:09.801975 2432 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:46:10.106841 kubelet[2432]: I1216 12:46:10.106772 2432 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 12:46:10.106841 kubelet[2432]: I1216 12:46:10.106799 2432 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:46:10.109783 kubelet[2432]: I1216 12:46:10.109729 2432 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 12:46:10.109783 kubelet[2432]: I1216 12:46:10.109748 2432 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 12:46:10.110033 kubelet[2432]: I1216 12:46:10.109990 2432 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 12:46:10.133092 kubelet[2432]: I1216 12:46:10.133033 2432 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:46:10.141260 kubelet[2432]: E1216 12:46:10.141212 2432 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://77.42.23.34:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 16 12:46:10.149834 kubelet[2432]: I1216 12:46:10.149793 2432 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:46:10.156946 kubelet[2432]: I1216 12:46:10.156884 2432 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 16 12:46:10.158025 kubelet[2432]: I1216 12:46:10.157955 2432 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:46:10.159902 kubelet[2432]: I1216 12:46:10.158016 2432 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-c-452f5360ea","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:46:10.159902 kubelet[2432]: I1216 12:46:10.159900 2432 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 12:46:10.160048 kubelet[2432]: I1216 12:46:10.159915 2432 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 12:46:10.160048 kubelet[2432]: I1216 12:46:10.160030 2432 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 12:46:10.164448 kubelet[2432]: I1216 12:46:10.164410 2432 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:46:10.165574 kubelet[2432]: I1216 12:46:10.165168 2432 kubelet.go:475] "Attempting to sync node with API server" Dec 16 12:46:10.165574 kubelet[2432]: I1216 12:46:10.165190 2432 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:46:10.165574 kubelet[2432]: I1216 12:46:10.165213 2432 kubelet.go:387] "Adding apiserver pod source" Dec 16 12:46:10.165574 kubelet[2432]: I1216 12:46:10.165230 2432 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:46:10.165770 kubelet[2432]: E1216 12:46:10.165730 2432 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://77.42.23.34:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-c-452f5360ea&limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 16 12:46:10.168331 kubelet[2432]: E1216 12:46:10.168173 2432 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get 
\"https://77.42.23.34:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 12:46:10.168909 kubelet[2432]: I1216 12:46:10.168894 2432 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:46:10.172479 kubelet[2432]: I1216 12:46:10.172381 2432 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 12:46:10.172479 kubelet[2432]: I1216 12:46:10.172414 2432 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 12:46:10.176570 kubelet[2432]: W1216 12:46:10.175507 2432 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 16 12:46:10.183123 kubelet[2432]: I1216 12:46:10.183110 2432 server.go:1262] "Started kubelet" Dec 16 12:46:10.190753 kubelet[2432]: I1216 12:46:10.190727 2432 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:46:10.199172 kubelet[2432]: E1216 12:46:10.197871 2432 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://77.42.23.34:6443/api/v1/namespaces/default/events\": dial tcp 77.42.23.34:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-c-452f5360ea.1881b2d81d78e290 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-c-452f5360ea,UID:ci-4547-0-0-c-452f5360ea,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-c-452f5360ea,},FirstTimestamp:2025-12-16 12:46:10.183086736 +0000 UTC m=+0.434011335,LastTimestamp:2025-12-16 12:46:10.183086736 +0000 UTC m=+0.434011335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-c-452f5360ea,}" Dec 16 12:46:10.202035 kubelet[2432]: I1216 12:46:10.201128 2432 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 12:46:10.202035 kubelet[2432]: E1216 12:46:10.201351 2432 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-c-452f5360ea\" not found" Dec 16 12:46:10.202119 kubelet[2432]: I1216 12:46:10.202086 2432 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 12:46:10.202418 kubelet[2432]: I1216 12:46:10.202160 2432 reconciler.go:29] "Reconciler: start to sync state" Dec 16 12:46:10.202619 kubelet[2432]: I1216 12:46:10.202389 2432 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:46:10.208104 kubelet[2432]: I1216 12:46:10.208070 2432 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:46:10.209137 kubelet[2432]: E1216 12:46:10.209104 2432 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://77.42.23.34:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 12:46:10.209278 kubelet[2432]: E1216 12:46:10.209218 2432 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.23.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-c-452f5360ea?timeout=10s\": dial tcp 77.42.23.34:6443: connect: connection refused" interval="200ms" Dec 16 12:46:10.209385 kubelet[2432]: I1216 12:46:10.209371 2432 factory.go:223] Registration of the systemd container factory successfully Dec 16 12:46:10.209668 kubelet[2432]: I1216 12:46:10.209646 2432 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:46:10.210631 kubelet[2432]: I1216 12:46:10.210591 2432 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:46:10.210676 kubelet[2432]: I1216 12:46:10.210651 2432 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 12:46:10.210992 kubelet[2432]: I1216 12:46:10.210967 2432 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:46:10.212950 kubelet[2432]: I1216 12:46:10.212935 2432 factory.go:223] Registration of the containerd container factory successfully Dec 16 12:46:10.217316 kubelet[2432]: I1216 12:46:10.217281 2432 server.go:310] "Adding debug handlers to kubelet server" Dec 16 12:46:10.218000 audit[2448]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2448 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:10.218000 audit[2448]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffc6debe350 a2=0 a3=0 items=0 ppid=2432 pid=2448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.218000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:46:10.221140 kubelet[2432]: E1216 12:46:10.221106 2432 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:46:10.221000 audit[2452]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2452 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:10.221000 audit[2452]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff387c6210 a2=0 a3=0 items=0 ppid=2432 pid=2452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.221000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:46:10.225000 audit[2454]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2454 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:10.225000 audit[2454]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffea89249f0 a2=0 a3=0 items=0 ppid=2432 pid=2454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.225000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:46:10.228000 audit[2456]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2456 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:10.228000 audit[2456]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffc110b83a0 a2=0 a3=0 items=0 ppid=2432 pid=2456 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.228000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:46:10.235652 kubelet[2432]: I1216 12:46:10.235639 2432 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:46:10.235745 kubelet[2432]: I1216 12:46:10.235736 2432 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:46:10.235851 kubelet[2432]: I1216 12:46:10.235841 2432 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:46:10.241318 kubelet[2432]: I1216 12:46:10.241141 2432 policy_none.go:49] "None policy: Start" Dec 16 12:46:10.241318 kubelet[2432]: I1216 12:46:10.241161 2432 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 12:46:10.241318 kubelet[2432]: I1216 12:46:10.241170 2432 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 12:46:10.239000 audit[2459]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2459 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:10.239000 audit[2459]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffde755f4e0 a2=0 a3=0 items=0 ppid=2432 pid=2459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.239000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F380000002D2D737263003132372E Dec 16 12:46:10.241966 kubelet[2432]: I1216 12:46:10.241920 2432 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 16 12:46:10.241000 audit[2460]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2460 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:10.241000 audit[2460]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7fff4c825c40 a2=0 a3=0 items=0 ppid=2432 pid=2460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.241000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:46:10.242000 audit[2461]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2461 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:10.242000 audit[2461]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcf87cf5e0 a2=0 a3=0 items=0 ppid=2432 pid=2461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.242000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:46:10.244342 kubelet[2432]: I1216 12:46:10.243716 2432 policy_none.go:47] "Start" Dec 16 12:46:10.244574 kubelet[2432]: I1216 12:46:10.244420 2432 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Dec 16 12:46:10.244574 kubelet[2432]: I1216 12:46:10.244438 2432 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 12:46:10.243000 audit[2462]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2462 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:10.243000 audit[2462]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc621aadf0 a2=0 a3=0 items=0 ppid=2432 pid=2462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.243000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:46:10.245374 kubelet[2432]: I1216 12:46:10.244661 2432 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 12:46:10.245374 kubelet[2432]: E1216 12:46:10.244694 2432 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:46:10.244000 audit[2464]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_chain pid=2464 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:10.244000 audit[2464]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdce38cba0 a2=0 a3=0 items=0 ppid=2432 pid=2464 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.247491 kubelet[2432]: E1216 12:46:10.247452 2432 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://77.42.23.34:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 12:46:10.244000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:46:10.246000 audit[2465]: NETFILTER_CFG table=mangle:51 family=10 entries=1 op=nft_register_chain pid=2465 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:10.246000 audit[2465]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe91d97d60 a2=0 a3=0 items=0 ppid=2432 pid=2465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.246000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:46:10.248000 audit[2467]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2467 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:10.248000 audit[2467]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc65bc94e0 a2=0 a3=0 items=0 ppid=2432 pid=2467 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.248000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:46:10.250000 audit[2468]: NETFILTER_CFG table=filter:53 
family=10 entries=1 op=nft_register_chain pid=2468 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:10.250000 audit[2468]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd515d3850 a2=0 a3=0 items=0 ppid=2432 pid=2468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:10.250000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:46:10.252637 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 16 12:46:10.261390 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 16 12:46:10.265048 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 16 12:46:10.274236 kubelet[2432]: E1216 12:46:10.274207 2432 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 12:46:10.274523 kubelet[2432]: I1216 12:46:10.274510 2432 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:46:10.274673 kubelet[2432]: I1216 12:46:10.274644 2432 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:46:10.275012 kubelet[2432]: I1216 12:46:10.274993 2432 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:46:10.277491 kubelet[2432]: E1216 12:46:10.277142 2432 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 12:46:10.277643 kubelet[2432]: E1216 12:46:10.277624 2432 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-0-0-c-452f5360ea\" not found" Dec 16 12:46:10.362780 systemd[1]: Created slice kubepods-burstable-podf339c371ed771e712fed1f0ae6cb4bae.slice - libcontainer container kubepods-burstable-podf339c371ed771e712fed1f0ae6cb4bae.slice. Dec 16 12:46:10.377918 kubelet[2432]: I1216 12:46:10.377878 2432 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.378541 kubelet[2432]: E1216 12:46:10.378490 2432 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-c-452f5360ea\" not found" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.379266 kubelet[2432]: E1216 12:46:10.379204 2432 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.23.34:6443/api/v1/nodes\": dial tcp 77.42.23.34:6443: connect: connection refused" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.382523 systemd[1]: Created slice kubepods-burstable-podf4148a05de67f9b028b38d4dee4449f2.slice - libcontainer container kubepods-burstable-podf4148a05de67f9b028b38d4dee4449f2.slice. 
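The NETFILTER_CFG / SYSCALL / PROCTITLE triples above are the audit trail of the kubelet creating its KUBE-* chains with iptables/ip6tables; the PROCTITLE field is hex-encoded because the recorded command line contains NUL separators between arguments. A minimal decoding sketch, assuming Python 3 (the sample value is copied from one of the PROCTITLE records above):

    # decode_proctitle.py - turn an audit PROCTITLE hex string back into argv.
    # Audit stores the command line as raw bytes with NUL bytes between
    # arguments, which is why the whole field appears hex-escaped in the log.
    def decode_proctitle(hex_value: str) -> str:
        raw = bytes.fromhex(hex_value)
        args = [part.decode("utf-8", "replace") for part in raw.split(b"\x00") if part]
        return " ".join(args)

    if __name__ == "__main__":
        # Sample taken from a PROCTITLE record above.
        sample = ("6970367461626C6573002D770035002D4E004B5542452D49505441424C45"
                  "532D48494E54002D74006D616E676C65")
        print(decode_proctitle(sample))
        # -> ip6tables -w 5 -N KUBE-IPTABLES-HINT -t mangle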
Dec 16 12:46:10.403057 kubelet[2432]: E1216 12:46:10.402881 2432 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-c-452f5360ea\" not found" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.403496 kubelet[2432]: I1216 12:46:10.403319 2432 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/84a7624c597f83bbc4e831003f342640-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-c-452f5360ea\" (UID: \"84a7624c597f83bbc4e831003f342640\") " pod="kube-system/kube-scheduler-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.403496 kubelet[2432]: I1216 12:46:10.403491 2432 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f4148a05de67f9b028b38d4dee4449f2-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-c-452f5360ea\" (UID: \"f4148a05de67f9b028b38d4dee4449f2\") " pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.404283 kubelet[2432]: I1216 12:46:10.403606 2432 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f339c371ed771e712fed1f0ae6cb4bae-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" (UID: \"f339c371ed771e712fed1f0ae6cb4bae\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.404283 kubelet[2432]: I1216 12:46:10.403957 2432 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f4148a05de67f9b028b38d4dee4449f2-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-c-452f5360ea\" (UID: \"f4148a05de67f9b028b38d4dee4449f2\") " pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.404283 kubelet[2432]: I1216 12:46:10.403986 2432 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f4148a05de67f9b028b38d4dee4449f2-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-c-452f5360ea\" (UID: \"f4148a05de67f9b028b38d4dee4449f2\") " pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.404283 kubelet[2432]: I1216 12:46:10.404039 2432 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f339c371ed771e712fed1f0ae6cb4bae-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" (UID: \"f339c371ed771e712fed1f0ae6cb4bae\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.404283 kubelet[2432]: I1216 12:46:10.404061 2432 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f339c371ed771e712fed1f0ae6cb4bae-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" (UID: \"f339c371ed771e712fed1f0ae6cb4bae\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.404523 kubelet[2432]: I1216 12:46:10.404074 2432 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f339c371ed771e712fed1f0ae6cb4bae-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" (UID: 
\"f339c371ed771e712fed1f0ae6cb4bae\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.404523 kubelet[2432]: I1216 12:46:10.404127 2432 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f339c371ed771e712fed1f0ae6cb4bae-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" (UID: \"f339c371ed771e712fed1f0ae6cb4bae\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.407233 systemd[1]: Created slice kubepods-burstable-pod84a7624c597f83bbc4e831003f342640.slice - libcontainer container kubepods-burstable-pod84a7624c597f83bbc4e831003f342640.slice. Dec 16 12:46:10.409381 kubelet[2432]: E1216 12:46:10.409351 2432 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-c-452f5360ea\" not found" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.409968 kubelet[2432]: E1216 12:46:10.409904 2432 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.23.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-c-452f5360ea?timeout=10s\": dial tcp 77.42.23.34:6443: connect: connection refused" interval="400ms" Dec 16 12:46:10.581823 kubelet[2432]: I1216 12:46:10.581749 2432 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.582302 kubelet[2432]: E1216 12:46:10.582240 2432 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.23.34:6443/api/v1/nodes\": dial tcp 77.42.23.34:6443: connect: connection refused" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.683588 containerd[1624]: time="2025-12-16T12:46:10.683324695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-c-452f5360ea,Uid:f339c371ed771e712fed1f0ae6cb4bae,Namespace:kube-system,Attempt:0,}" Dec 16 12:46:10.706217 containerd[1624]: time="2025-12-16T12:46:10.706146984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-c-452f5360ea,Uid:f4148a05de67f9b028b38d4dee4449f2,Namespace:kube-system,Attempt:0,}" Dec 16 12:46:10.712346 containerd[1624]: time="2025-12-16T12:46:10.712294514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-c-452f5360ea,Uid:84a7624c597f83bbc4e831003f342640,Namespace:kube-system,Attempt:0,}" Dec 16 12:46:10.810956 kubelet[2432]: E1216 12:46:10.810874 2432 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.23.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-c-452f5360ea?timeout=10s\": dial tcp 77.42.23.34:6443: connect: connection refused" interval="800ms" Dec 16 12:46:10.985313 kubelet[2432]: I1216 12:46:10.984936 2432 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:10.985686 kubelet[2432]: E1216 12:46:10.985618 2432 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.23.34:6443/api/v1/nodes\": dial tcp 77.42.23.34:6443: connect: connection refused" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:11.106402 kubelet[2432]: E1216 12:46:11.106345 2432 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://77.42.23.34:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 
77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 12:46:11.180501 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3872551726.mount: Deactivated successfully. Dec 16 12:46:11.185889 containerd[1624]: time="2025-12-16T12:46:11.185354229Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:46:11.186710 containerd[1624]: time="2025-12-16T12:46:11.186692419Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:46:11.190270 containerd[1624]: time="2025-12-16T12:46:11.190224859Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:46:11.191493 containerd[1624]: time="2025-12-16T12:46:11.191440252Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:46:11.193657 containerd[1624]: time="2025-12-16T12:46:11.193560850Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:46:11.195007 containerd[1624]: time="2025-12-16T12:46:11.194964189Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:46:11.195795 containerd[1624]: time="2025-12-16T12:46:11.195755349Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:46:11.197253 containerd[1624]: time="2025-12-16T12:46:11.197143689Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:46:11.198442 containerd[1624]: time="2025-12-16T12:46:11.197577076Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 506.190004ms" Dec 16 12:46:11.198799 containerd[1624]: time="2025-12-16T12:46:11.198727493Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 484.31661ms" Dec 16 12:46:11.201817 containerd[1624]: time="2025-12-16T12:46:11.201730917Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 493.215077ms" Dec 16 12:46:11.314389 kubelet[2432]: E1216 
12:46:11.313601 2432 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://77.42.23.34:6443/api/v1/namespaces/default/events\": dial tcp 77.42.23.34:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-c-452f5360ea.1881b2d81d78e290 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-c-452f5360ea,UID:ci-4547-0-0-c-452f5360ea,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-c-452f5360ea,},FirstTimestamp:2025-12-16 12:46:10.183086736 +0000 UTC m=+0.434011335,LastTimestamp:2025-12-16 12:46:10.183086736 +0000 UTC m=+0.434011335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-c-452f5360ea,}" Dec 16 12:46:11.320970 containerd[1624]: time="2025-12-16T12:46:11.320907069Z" level=info msg="connecting to shim c136070a436d7c9a0b4df15a4476da693a1c9a34fe0ad76af60a348d79c2118e" address="unix:///run/containerd/s/182570ef8b40130a787c5e0caea0d640201bd5733f6bcc2322f0eec1b46193c6" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:46:11.331660 containerd[1624]: time="2025-12-16T12:46:11.331604753Z" level=info msg="connecting to shim f4468fd9d0dad3406d86c94157e59de0a077e81ba3c30ef22106a807e960e4de" address="unix:///run/containerd/s/3ed8244b314c48a7828785cebabb7e568327b088ed48a719a2908f2a152beec6" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:46:11.339554 containerd[1624]: time="2025-12-16T12:46:11.339480713Z" level=info msg="connecting to shim dd6cdce9f0c6f3b6112cbd29d6a984c0e9589f46534b684cd7f990801c0d8127" address="unix:///run/containerd/s/b80f004de849536c7efe2675949759511f0ddd05c76c59c4db4d88989c736b4a" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:46:11.344238 kubelet[2432]: E1216 12:46:11.344187 2432 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://77.42.23.34:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-c-452f5360ea&limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 16 12:46:11.372884 kubelet[2432]: E1216 12:46:11.372609 2432 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://77.42.23.34:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 12:46:11.431972 systemd[1]: Started cri-containerd-c136070a436d7c9a0b4df15a4476da693a1c9a34fe0ad76af60a348d79c2118e.scope - libcontainer container c136070a436d7c9a0b4df15a4476da693a1c9a34fe0ad76af60a348d79c2118e. Dec 16 12:46:11.435400 systemd[1]: Started cri-containerd-dd6cdce9f0c6f3b6112cbd29d6a984c0e9589f46534b684cd7f990801c0d8127.scope - libcontainer container dd6cdce9f0c6f3b6112cbd29d6a984c0e9589f46534b684cd7f990801c0d8127. Dec 16 12:46:11.438482 systemd[1]: Started cri-containerd-f4468fd9d0dad3406d86c94157e59de0a077e81ba3c30ef22106a807e960e4de.scope - libcontainer container f4468fd9d0dad3406d86c94157e59de0a077e81ba3c30ef22106a807e960e4de. 
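The repeated "connection refused" errors against https://77.42.23.34:6443 in the lines above are expected at this point: this kubelet is itself bringing up the kube-apiserver static pod, so watches, lease renewals, event posting, and node registration keep retrying until the apiserver sandbox started further down is serving. A small reachability probe, a sketch assuming Python 3 and the endpoint shown in the log:

    # probe_apiserver.py - check whether the API endpoint the kubelet is
    # retrying above accepts TCP connections yet.
    import socket
    import sys

    HOST, PORT = "77.42.23.34", 6443  # address and port from the kubelet errors above

    def reachable(host: str, port: int, timeout: float = 2.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError as exc:
            print(f"{host}:{port} not reachable yet: {exc}", file=sys.stderr)
            return False

    if __name__ == "__main__":
        sys.exit(0 if reachable(HOST, PORT) else 1)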
Dec 16 12:46:11.469631 kubelet[2432]: E1216 12:46:11.469502 2432 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://77.42.23.34:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 12:46:11.475000 audit: BPF prog-id=83 op=LOAD Dec 16 12:46:11.478000 audit: BPF prog-id=84 op=LOAD Dec 16 12:46:11.479000 audit: BPF prog-id=85 op=LOAD Dec 16 12:46:11.479000 audit[2531]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2501 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.479000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634343638666439643064616433343036643836633934313537653539 Dec 16 12:46:11.479000 audit: BPF prog-id=85 op=UNLOAD Dec 16 12:46:11.479000 audit[2531]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2501 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.479000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634343638666439643064616433343036643836633934313537653539 Dec 16 12:46:11.480000 audit: BPF prog-id=86 op=LOAD Dec 16 12:46:11.480000 audit[2531]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2501 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.480000 audit: BPF prog-id=87 op=LOAD Dec 16 12:46:11.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634343638666439643064616433343036643836633934313537653539 Dec 16 12:46:11.480000 audit: BPF prog-id=88 op=LOAD Dec 16 12:46:11.480000 audit[2529]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2504 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464366364636539663063366633623631313263626432396436613938 Dec 16 12:46:11.480000 audit: BPF prog-id=87 op=UNLOAD Dec 16 12:46:11.480000 audit[2531]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2501 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.480000 audit[2529]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2504 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634343638666439643064616433343036643836633934313537653539 Dec 16 12:46:11.480000 audit: BPF prog-id=88 op=UNLOAD Dec 16 12:46:11.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464366364636539663063366633623631313263626432396436613938 Dec 16 12:46:11.480000 audit[2531]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2501 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634343638666439643064616433343036643836633934313537653539 Dec 16 12:46:11.481000 audit: BPF prog-id=86 op=UNLOAD Dec 16 12:46:11.481000 audit[2531]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2501 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634343638666439643064616433343036643836633934313537653539 Dec 16 12:46:11.481000 audit: BPF prog-id=89 op=LOAD Dec 16 12:46:11.481000 audit[2529]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2504 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464366364636539663063366633623631313263626432396436613938 Dec 16 12:46:11.481000 audit: BPF prog-id=90 op=LOAD Dec 16 12:46:11.481000 audit[2529]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2504 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.481000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464366364636539663063366633623631313263626432396436613938 Dec 16 12:46:11.482000 audit: BPF prog-id=90 op=UNLOAD Dec 16 12:46:11.482000 audit[2529]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2504 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.482000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464366364636539663063366633623631313263626432396436613938 Dec 16 12:46:11.482000 audit: BPF prog-id=89 op=UNLOAD Dec 16 12:46:11.482000 audit[2529]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2504 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.482000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464366364636539663063366633623631313263626432396436613938 Dec 16 12:46:11.481000 audit: BPF prog-id=91 op=LOAD Dec 16 12:46:11.481000 audit[2531]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2501 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634343638666439643064616433343036643836633934313537653539 Dec 16 12:46:11.482000 audit: BPF prog-id=92 op=LOAD Dec 16 12:46:11.482000 audit[2529]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2504 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.482000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464366364636539663063366633623631313263626432396436613938 Dec 16 12:46:11.484000 audit: BPF prog-id=93 op=LOAD Dec 16 12:46:11.485000 audit: BPF prog-id=94 op=LOAD Dec 16 12:46:11.485000 audit[2533]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2489 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.485000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331333630373061343336643763396130623464663135613434373664 Dec 16 12:46:11.485000 audit: BPF prog-id=94 op=UNLOAD Dec 16 12:46:11.485000 audit[2533]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331333630373061343336643763396130623464663135613434373664 Dec 16 12:46:11.485000 audit: BPF prog-id=95 op=LOAD Dec 16 12:46:11.485000 audit[2533]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2489 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331333630373061343336643763396130623464663135613434373664 Dec 16 12:46:11.485000 audit: BPF prog-id=96 op=LOAD Dec 16 12:46:11.485000 audit[2533]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2489 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331333630373061343336643763396130623464663135613434373664 Dec 16 12:46:11.485000 audit: BPF prog-id=96 op=UNLOAD Dec 16 12:46:11.485000 audit[2533]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331333630373061343336643763396130623464663135613434373664 Dec 16 12:46:11.485000 audit: BPF prog-id=95 op=UNLOAD Dec 16 12:46:11.485000 audit[2533]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.485000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331333630373061343336643763396130623464663135613434373664 Dec 16 12:46:11.485000 audit: BPF prog-id=97 op=LOAD Dec 16 12:46:11.485000 audit[2533]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2489 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.485000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331333630373061343336643763396130623464663135613434373664 Dec 16 12:46:11.526191 containerd[1624]: time="2025-12-16T12:46:11.525507432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-c-452f5360ea,Uid:84a7624c597f83bbc4e831003f342640,Namespace:kube-system,Attempt:0,} returns sandbox id \"c136070a436d7c9a0b4df15a4476da693a1c9a34fe0ad76af60a348d79c2118e\"" Dec 16 12:46:11.533717 containerd[1624]: time="2025-12-16T12:46:11.533677278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-c-452f5360ea,Uid:f339c371ed771e712fed1f0ae6cb4bae,Namespace:kube-system,Attempt:0,} returns sandbox id \"f4468fd9d0dad3406d86c94157e59de0a077e81ba3c30ef22106a807e960e4de\"" Dec 16 12:46:11.541283 containerd[1624]: time="2025-12-16T12:46:11.541249373Z" level=info msg="CreateContainer within sandbox \"c136070a436d7c9a0b4df15a4476da693a1c9a34fe0ad76af60a348d79c2118e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 16 12:46:11.541789 containerd[1624]: time="2025-12-16T12:46:11.541744640Z" level=info msg="CreateContainer within sandbox \"f4468fd9d0dad3406d86c94157e59de0a077e81ba3c30ef22106a807e960e4de\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 16 12:46:11.556776 containerd[1624]: time="2025-12-16T12:46:11.556573286Z" level=info msg="Container b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:46:11.556874 containerd[1624]: time="2025-12-16T12:46:11.556709288Z" level=info msg="Container de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:46:11.564491 containerd[1624]: time="2025-12-16T12:46:11.564392158Z" level=info msg="CreateContainer within sandbox \"f4468fd9d0dad3406d86c94157e59de0a077e81ba3c30ef22106a807e960e4de\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398\"" Dec 16 12:46:11.568941 containerd[1624]: time="2025-12-16T12:46:11.568177549Z" level=info msg="StartContainer for \"de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398\"" Dec 16 12:46:11.569316 containerd[1624]: time="2025-12-16T12:46:11.568593090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-c-452f5360ea,Uid:f4148a05de67f9b028b38d4dee4449f2,Namespace:kube-system,Attempt:0,} returns sandbox id \"dd6cdce9f0c6f3b6112cbd29d6a984c0e9589f46534b684cd7f990801c0d8127\"" Dec 16 12:46:11.569477 containerd[1624]: time="2025-12-16T12:46:11.569459164Z" 
level=info msg="CreateContainer within sandbox \"c136070a436d7c9a0b4df15a4476da693a1c9a34fe0ad76af60a348d79c2118e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2\"" Dec 16 12:46:11.570033 containerd[1624]: time="2025-12-16T12:46:11.570017845Z" level=info msg="StartContainer for \"b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2\"" Dec 16 12:46:11.571542 containerd[1624]: time="2025-12-16T12:46:11.571107203Z" level=info msg="connecting to shim b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2" address="unix:///run/containerd/s/182570ef8b40130a787c5e0caea0d640201bd5733f6bcc2322f0eec1b46193c6" protocol=ttrpc version=3 Dec 16 12:46:11.574255 containerd[1624]: time="2025-12-16T12:46:11.574140756Z" level=info msg="connecting to shim de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398" address="unix:///run/containerd/s/3ed8244b314c48a7828785cebabb7e568327b088ed48a719a2908f2a152beec6" protocol=ttrpc version=3 Dec 16 12:46:11.575674 containerd[1624]: time="2025-12-16T12:46:11.575644202Z" level=info msg="CreateContainer within sandbox \"dd6cdce9f0c6f3b6112cbd29d6a984c0e9589f46534b684cd7f990801c0d8127\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 16 12:46:11.589307 containerd[1624]: time="2025-12-16T12:46:11.589267330Z" level=info msg="Container 89575aeeee998a010132cf1dc71a9b1fb0ce9f67bef40b54b02284a338f35550: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:46:11.591735 systemd[1]: Started cri-containerd-b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2.scope - libcontainer container b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2. Dec 16 12:46:11.595306 systemd[1]: Started cri-containerd-de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398.scope - libcontainer container de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398. 
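Each containerd "connecting to shim <id>" message above carries the sandbox or container id together with the shim's ttrpc unix socket; pairing them is one quick way to map the static-pod sandboxes created here to their runc shims. A rough log-parsing sketch, assuming the journal text is piped to stdin and matching only the fields that appear in these messages:

    # map_shims.py - correlate containerd "connecting to shim <id>" messages
    # with their unix socket addresses from a journal dump on stdin.
    import re
    import sys

    SHIM_RE = re.compile(
        r'connecting to shim (?P<cid>[0-9a-f]{64})" address="(?P<addr>unix://[^"]+)"'
    )

    def main() -> None:
        seen = {}
        for line in sys.stdin:
            for m in SHIM_RE.finditer(line):
                seen.setdefault(m.group("cid"), m.group("addr"))
        for cid, addr in seen.items():
            print(f"{cid[:12]}  {addr}")

    if __name__ == "__main__":
        main()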
Dec 16 12:46:11.602725 containerd[1624]: time="2025-12-16T12:46:11.602603328Z" level=info msg="CreateContainer within sandbox \"dd6cdce9f0c6f3b6112cbd29d6a984c0e9589f46534b684cd7f990801c0d8127\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"89575aeeee998a010132cf1dc71a9b1fb0ce9f67bef40b54b02284a338f35550\"" Dec 16 12:46:11.603468 containerd[1624]: time="2025-12-16T12:46:11.603101402Z" level=info msg="StartContainer for \"89575aeeee998a010132cf1dc71a9b1fb0ce9f67bef40b54b02284a338f35550\"" Dec 16 12:46:11.605711 containerd[1624]: time="2025-12-16T12:46:11.605693184Z" level=info msg="connecting to shim 89575aeeee998a010132cf1dc71a9b1fb0ce9f67bef40b54b02284a338f35550" address="unix:///run/containerd/s/b80f004de849536c7efe2675949759511f0ddd05c76c59c4db4d88989c736b4a" protocol=ttrpc version=3 Dec 16 12:46:11.606000 audit: BPF prog-id=98 op=LOAD Dec 16 12:46:11.607000 audit: BPF prog-id=99 op=LOAD Dec 16 12:46:11.607000 audit[2608]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2489 pid=2608 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.607000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232633565373133376434323133306131613764663230656139613632 Dec 16 12:46:11.607000 audit: BPF prog-id=99 op=UNLOAD Dec 16 12:46:11.607000 audit[2608]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2608 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.607000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232633565373133376434323133306131613764663230656139613632 Dec 16 12:46:11.607000 audit: BPF prog-id=100 op=LOAD Dec 16 12:46:11.607000 audit[2608]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2489 pid=2608 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.607000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232633565373133376434323133306131613764663230656139613632 Dec 16 12:46:11.607000 audit: BPF prog-id=101 op=LOAD Dec 16 12:46:11.607000 audit[2608]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=2489 pid=2608 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.607000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232633565373133376434323133306131613764663230656139613632 Dec 16 12:46:11.607000 audit: BPF prog-id=101 op=UNLOAD Dec 16 12:46:11.607000 audit[2608]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2608 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.607000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232633565373133376434323133306131613764663230656139613632 Dec 16 12:46:11.607000 audit: BPF prog-id=100 op=UNLOAD Dec 16 12:46:11.607000 audit[2608]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=2608 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.607000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232633565373133376434323133306131613764663230656139613632 Dec 16 12:46:11.607000 audit: BPF prog-id=102 op=LOAD Dec 16 12:46:11.607000 audit[2608]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=2489 pid=2608 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.607000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232633565373133376434323133306131613764663230656139613632 Dec 16 12:46:11.611426 kubelet[2432]: E1216 12:46:11.611400 2432 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.23.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-c-452f5360ea?timeout=10s\": dial tcp 77.42.23.34:6443: connect: connection refused" interval="1.6s" Dec 16 12:46:11.618000 audit: BPF prog-id=103 op=LOAD Dec 16 12:46:11.621000 audit: BPF prog-id=104 op=LOAD Dec 16 12:46:11.621000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2501 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.621000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465306661343434643363386232393234663432303035623864373762 Dec 16 12:46:11.621000 audit: BPF prog-id=104 op=UNLOAD Dec 16 12:46:11.621000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 
ppid=2501 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.621000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465306661343434643363386232393234663432303035623864373762 Dec 16 12:46:11.621000 audit: BPF prog-id=105 op=LOAD Dec 16 12:46:11.621000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2501 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.621000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465306661343434643363386232393234663432303035623864373762 Dec 16 12:46:11.621000 audit: BPF prog-id=106 op=LOAD Dec 16 12:46:11.621000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=2501 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.621000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465306661343434643363386232393234663432303035623864373762 Dec 16 12:46:11.621000 audit: BPF prog-id=106 op=UNLOAD Dec 16 12:46:11.621000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2501 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.621000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465306661343434643363386232393234663432303035623864373762 Dec 16 12:46:11.621000 audit: BPF prog-id=105 op=UNLOAD Dec 16 12:46:11.621000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2501 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.621000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465306661343434643363386232393234663432303035623864373762 Dec 16 12:46:11.621000 audit: BPF prog-id=107 op=LOAD Dec 16 12:46:11.621000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=2501 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.621000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465306661343434643363386232393234663432303035623864373762 Dec 16 12:46:11.630773 systemd[1]: Started cri-containerd-89575aeeee998a010132cf1dc71a9b1fb0ce9f67bef40b54b02284a338f35550.scope - libcontainer container 89575aeeee998a010132cf1dc71a9b1fb0ce9f67bef40b54b02284a338f35550. Dec 16 12:46:11.647000 audit: BPF prog-id=108 op=LOAD Dec 16 12:46:11.648000 audit: BPF prog-id=109 op=LOAD Dec 16 12:46:11.648000 audit[2645]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2504 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.648000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839353735616565656539393861303130313332636631646337316139 Dec 16 12:46:11.648000 audit: BPF prog-id=109 op=UNLOAD Dec 16 12:46:11.648000 audit[2645]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2504 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.648000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839353735616565656539393861303130313332636631646337316139 Dec 16 12:46:11.648000 audit: BPF prog-id=110 op=LOAD Dec 16 12:46:11.648000 audit[2645]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2504 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.648000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839353735616565656539393861303130313332636631646337316139 Dec 16 12:46:11.648000 audit: BPF prog-id=111 op=LOAD Dec 16 12:46:11.648000 audit[2645]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=2504 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.648000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839353735616565656539393861303130313332636631646337316139 Dec 16 12:46:11.649000 audit: BPF prog-id=111 op=UNLOAD Dec 16 12:46:11.649000 audit[2645]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2504 pid=2645 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.649000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839353735616565656539393861303130313332636631646337316139 Dec 16 12:46:11.649000 audit: BPF prog-id=110 op=UNLOAD Dec 16 12:46:11.649000 audit[2645]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2504 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.649000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839353735616565656539393861303130313332636631646337316139 Dec 16 12:46:11.650000 audit: BPF prog-id=112 op=LOAD Dec 16 12:46:11.650000 audit[2645]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=2504 pid=2645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:11.650000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839353735616565656539393861303130313332636631646337316139 Dec 16 12:46:11.658142 containerd[1624]: time="2025-12-16T12:46:11.658015741Z" level=info msg="StartContainer for \"b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2\" returns successfully" Dec 16 12:46:11.678561 containerd[1624]: time="2025-12-16T12:46:11.676782192Z" level=info msg="StartContainer for \"de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398\" returns successfully" Dec 16 12:46:11.708494 containerd[1624]: time="2025-12-16T12:46:11.708435060Z" level=info msg="StartContainer for \"89575aeeee998a010132cf1dc71a9b1fb0ce9f67bef40b54b02284a338f35550\" returns successfully" Dec 16 12:46:11.789303 kubelet[2432]: I1216 12:46:11.789260 2432 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:11.790200 kubelet[2432]: E1216 12:46:11.790162 2432 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.23.34:6443/api/v1/nodes\": dial tcp 77.42.23.34:6443: connect: connection refused" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:12.263097 kubelet[2432]: E1216 12:46:12.262690 2432 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-c-452f5360ea\" not found" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:12.263097 kubelet[2432]: E1216 12:46:12.262985 2432 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-c-452f5360ea\" not found" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:12.267809 kubelet[2432]: E1216 12:46:12.267746 2432 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ci-4547-0-0-c-452f5360ea\" not found" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:13.271557 kubelet[2432]: E1216 12:46:13.270137 2432 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-c-452f5360ea\" not found" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:13.272429 kubelet[2432]: E1216 12:46:13.272160 2432 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-c-452f5360ea\" not found" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:13.392922 kubelet[2432]: I1216 12:46:13.392900 2432 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:13.489198 kubelet[2432]: E1216 12:46:13.489156 2432 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-0-0-c-452f5360ea\" not found" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:13.545055 kubelet[2432]: I1216 12:46:13.544891 2432 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:13.602733 kubelet[2432]: I1216 12:46:13.602687 2432 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:13.614655 kubelet[2432]: E1216 12:46:13.614623 2432 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-c-452f5360ea\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:13.614655 kubelet[2432]: I1216 12:46:13.614649 2432 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:13.618663 kubelet[2432]: E1216 12:46:13.618615 2432 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:13.618663 kubelet[2432]: I1216 12:46:13.618660 2432 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:13.620072 kubelet[2432]: E1216 12:46:13.620042 2432 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-c-452f5360ea\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:14.171583 kubelet[2432]: I1216 12:46:14.171501 2432 apiserver.go:52] "Watching apiserver" Dec 16 12:46:14.203062 kubelet[2432]: I1216 12:46:14.203005 2432 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 12:46:15.326066 kubelet[2432]: I1216 12:46:15.326035 2432 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:15.545389 kubelet[2432]: I1216 12:46:15.545345 2432 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:15.724084 systemd[1]: Reload requested from client PID 2709 ('systemctl') (unit session-8.scope)... Dec 16 12:46:15.724105 systemd[1]: Reloading... Dec 16 12:46:15.814592 zram_generator::config[2752]: No configuration found. Dec 16 12:46:16.030087 systemd[1]: Reloading finished in 305 ms. 
Dec 16 12:46:16.055823 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:46:16.076968 systemd[1]: kubelet.service: Deactivated successfully. Dec 16 12:46:16.077329 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:46:16.076000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:46:16.078080 kernel: kauditd_printk_skb: 202 callbacks suppressed Dec 16 12:46:16.078128 kernel: audit: type=1131 audit(1765889176.076:396): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:46:16.079604 systemd[1]: kubelet.service: Consumed 785ms CPU time, 123.9M memory peak. Dec 16 12:46:16.085861 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:46:16.087000 audit: BPF prog-id=113 op=LOAD Dec 16 12:46:16.091723 kernel: audit: type=1334 audit(1765889176.087:397): prog-id=113 op=LOAD Dec 16 12:46:16.091774 kernel: audit: type=1334 audit(1765889176.087:398): prog-id=63 op=UNLOAD Dec 16 12:46:16.087000 audit: BPF prog-id=63 op=UNLOAD Dec 16 12:46:16.087000 audit: BPF prog-id=114 op=LOAD Dec 16 12:46:16.093904 kernel: audit: type=1334 audit(1765889176.087:399): prog-id=114 op=LOAD Dec 16 12:46:16.097558 kernel: audit: type=1334 audit(1765889176.087:400): prog-id=115 op=LOAD Dec 16 12:46:16.097603 kernel: audit: type=1334 audit(1765889176.087:401): prog-id=64 op=UNLOAD Dec 16 12:46:16.087000 audit: BPF prog-id=115 op=LOAD Dec 16 12:46:16.087000 audit: BPF prog-id=64 op=UNLOAD Dec 16 12:46:16.101869 kernel: audit: type=1334 audit(1765889176.087:402): prog-id=65 op=UNLOAD Dec 16 12:46:16.087000 audit: BPF prog-id=65 op=UNLOAD Dec 16 12:46:16.104312 kernel: audit: type=1334 audit(1765889176.090:403): prog-id=116 op=LOAD Dec 16 12:46:16.090000 audit: BPF prog-id=116 op=LOAD Dec 16 12:46:16.106623 kernel: audit: type=1334 audit(1765889176.090:404): prog-id=79 op=UNLOAD Dec 16 12:46:16.090000 audit: BPF prog-id=79 op=UNLOAD Dec 16 12:46:16.090000 audit: BPF prog-id=117 op=LOAD Dec 16 12:46:16.091000 audit: BPF prog-id=118 op=LOAD Dec 16 12:46:16.095000 audit: BPF prog-id=80 op=UNLOAD Dec 16 12:46:16.095000 audit: BPF prog-id=81 op=UNLOAD Dec 16 12:46:16.109575 kernel: audit: type=1334 audit(1765889176.090:405): prog-id=117 op=LOAD Dec 16 12:46:16.095000 audit: BPF prog-id=119 op=LOAD Dec 16 12:46:16.095000 audit: BPF prog-id=82 op=UNLOAD Dec 16 12:46:16.095000 audit: BPF prog-id=120 op=LOAD Dec 16 12:46:16.095000 audit: BPF prog-id=71 op=UNLOAD Dec 16 12:46:16.095000 audit: BPF prog-id=121 op=LOAD Dec 16 12:46:16.095000 audit: BPF prog-id=122 op=LOAD Dec 16 12:46:16.095000 audit: BPF prog-id=72 op=UNLOAD Dec 16 12:46:16.095000 audit: BPF prog-id=73 op=UNLOAD Dec 16 12:46:16.095000 audit: BPF prog-id=123 op=LOAD Dec 16 12:46:16.095000 audit: BPF prog-id=68 op=UNLOAD Dec 16 12:46:16.095000 audit: BPF prog-id=124 op=LOAD Dec 16 12:46:16.095000 audit: BPF prog-id=125 op=LOAD Dec 16 12:46:16.095000 audit: BPF prog-id=69 op=UNLOAD Dec 16 12:46:16.095000 audit: BPF prog-id=70 op=UNLOAD Dec 16 12:46:16.097000 audit: BPF prog-id=126 op=LOAD Dec 16 12:46:16.097000 audit: BPF prog-id=67 op=UNLOAD Dec 16 12:46:16.099000 audit: BPF prog-id=127 op=LOAD Dec 16 12:46:16.099000 audit: BPF prog-id=66 op=UNLOAD Dec 16 12:46:16.099000 
audit: BPF prog-id=128 op=LOAD Dec 16 12:46:16.099000 audit: BPF prog-id=129 op=LOAD Dec 16 12:46:16.099000 audit: BPF prog-id=74 op=UNLOAD Dec 16 12:46:16.099000 audit: BPF prog-id=75 op=UNLOAD Dec 16 12:46:16.099000 audit: BPF prog-id=130 op=LOAD Dec 16 12:46:16.099000 audit: BPF prog-id=76 op=UNLOAD Dec 16 12:46:16.099000 audit: BPF prog-id=131 op=LOAD Dec 16 12:46:16.099000 audit: BPF prog-id=132 op=LOAD Dec 16 12:46:16.099000 audit: BPF prog-id=77 op=UNLOAD Dec 16 12:46:16.099000 audit: BPF prog-id=78 op=UNLOAD Dec 16 12:46:16.236000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:46:16.236085 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:46:16.244922 (kubelet)[2807]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:46:16.301591 kubelet[2807]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 12:46:16.301591 kubelet[2807]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:46:16.301591 kubelet[2807]: I1216 12:46:16.301506 2807 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:46:16.307799 kubelet[2807]: I1216 12:46:16.307755 2807 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 12:46:16.307799 kubelet[2807]: I1216 12:46:16.307776 2807 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:46:16.307799 kubelet[2807]: I1216 12:46:16.307796 2807 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 12:46:16.307799 kubelet[2807]: I1216 12:46:16.307801 2807 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 12:46:16.307982 kubelet[2807]: I1216 12:46:16.307954 2807 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 12:46:16.308848 kubelet[2807]: I1216 12:46:16.308827 2807 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 16 12:46:16.311221 kubelet[2807]: I1216 12:46:16.311192 2807 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:46:16.318703 kubelet[2807]: I1216 12:46:16.318665 2807 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:46:16.325579 kubelet[2807]: I1216 12:46:16.324332 2807 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 16 12:46:16.326010 kubelet[2807]: I1216 12:46:16.325964 2807 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:46:16.326681 kubelet[2807]: I1216 12:46:16.326012 2807 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-c-452f5360ea","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:46:16.326791 kubelet[2807]: I1216 12:46:16.326686 2807 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 12:46:16.326791 kubelet[2807]: I1216 12:46:16.326716 2807 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 12:46:16.326791 kubelet[2807]: I1216 12:46:16.326738 2807 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 12:46:16.328359 kubelet[2807]: I1216 12:46:16.328318 2807 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:46:16.328520 kubelet[2807]: I1216 12:46:16.328482 2807 kubelet.go:475] "Attempting to sync node with API server" Dec 16 12:46:16.328520 kubelet[2807]: I1216 12:46:16.328504 2807 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:46:16.329873 kubelet[2807]: I1216 12:46:16.329834 2807 kubelet.go:387] "Adding apiserver pod source" Dec 16 12:46:16.329873 kubelet[2807]: I1216 12:46:16.329864 2807 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:46:16.333552 kubelet[2807]: I1216 12:46:16.332317 2807 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:46:16.333552 kubelet[2807]: I1216 12:46:16.332962 2807 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 12:46:16.333552 kubelet[2807]: I1216 12:46:16.332986 2807 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 
12:46:16.349268 kubelet[2807]: I1216 12:46:16.349220 2807 server.go:1262] "Started kubelet" Dec 16 12:46:16.352421 kubelet[2807]: I1216 12:46:16.350281 2807 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:46:16.352696 kubelet[2807]: I1216 12:46:16.352649 2807 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:46:16.357158 kubelet[2807]: I1216 12:46:16.357139 2807 server.go:310] "Adding debug handlers to kubelet server" Dec 16 12:46:16.368637 kubelet[2807]: I1216 12:46:16.367157 2807 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:46:16.368758 kubelet[2807]: I1216 12:46:16.368703 2807 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 12:46:16.370329 kubelet[2807]: I1216 12:46:16.369256 2807 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:46:16.374227 kubelet[2807]: I1216 12:46:16.374177 2807 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:46:16.375244 kubelet[2807]: I1216 12:46:16.375199 2807 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 12:46:16.375470 kubelet[2807]: E1216 12:46:16.375442 2807 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-c-452f5360ea\" not found" Dec 16 12:46:16.376487 kubelet[2807]: I1216 12:46:16.376464 2807 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 12:46:16.376656 kubelet[2807]: I1216 12:46:16.376624 2807 reconciler.go:29] "Reconciler: start to sync state" Dec 16 12:46:16.379897 kubelet[2807]: I1216 12:46:16.379866 2807 factory.go:223] Registration of the systemd container factory successfully Dec 16 12:46:16.380034 kubelet[2807]: I1216 12:46:16.379988 2807 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:46:16.387005 kubelet[2807]: I1216 12:46:16.386954 2807 factory.go:223] Registration of the containerd container factory successfully Dec 16 12:46:16.388958 kubelet[2807]: E1216 12:46:16.388928 2807 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:46:16.409965 kubelet[2807]: I1216 12:46:16.409925 2807 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 16 12:46:16.411588 kubelet[2807]: I1216 12:46:16.411039 2807 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Dec 16 12:46:16.411588 kubelet[2807]: I1216 12:46:16.411064 2807 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 12:46:16.411588 kubelet[2807]: I1216 12:46:16.411101 2807 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 12:46:16.411588 kubelet[2807]: E1216 12:46:16.411176 2807 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:46:16.440729 kubelet[2807]: I1216 12:46:16.440683 2807 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:46:16.440729 kubelet[2807]: I1216 12:46:16.440704 2807 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:46:16.440729 kubelet[2807]: I1216 12:46:16.440724 2807 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:46:16.440910 kubelet[2807]: I1216 12:46:16.440833 2807 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 16 12:46:16.440910 kubelet[2807]: I1216 12:46:16.440842 2807 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 16 12:46:16.440910 kubelet[2807]: I1216 12:46:16.440857 2807 policy_none.go:49] "None policy: Start" Dec 16 12:46:16.440910 kubelet[2807]: I1216 12:46:16.440869 2807 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 12:46:16.440910 kubelet[2807]: I1216 12:46:16.440883 2807 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 12:46:16.441013 kubelet[2807]: I1216 12:46:16.440981 2807 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Dec 16 12:46:16.441013 kubelet[2807]: I1216 12:46:16.440994 2807 policy_none.go:47] "Start" Dec 16 12:46:16.446956 kubelet[2807]: E1216 12:46:16.446910 2807 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 12:46:16.447163 kubelet[2807]: I1216 12:46:16.447108 2807 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:46:16.447210 kubelet[2807]: I1216 12:46:16.447156 2807 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:46:16.447925 kubelet[2807]: I1216 12:46:16.447783 2807 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:46:16.451277 kubelet[2807]: E1216 12:46:16.451020 2807 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 16 12:46:16.512173 kubelet[2807]: I1216 12:46:16.512110 2807 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.512638 kubelet[2807]: I1216 12:46:16.512558 2807 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.512773 kubelet[2807]: I1216 12:46:16.512282 2807 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.521221 kubelet[2807]: E1216 12:46:16.521125 2807 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" already exists" pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.521282 kubelet[2807]: E1216 12:46:16.521246 2807 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-c-452f5360ea\" already exists" pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.552817 kubelet[2807]: I1216 12:46:16.551982 2807 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.562018 kubelet[2807]: I1216 12:46:16.561959 2807 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.562018 kubelet[2807]: I1216 12:46:16.562027 2807 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.579562 kubelet[2807]: I1216 12:46:16.579258 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f339c371ed771e712fed1f0ae6cb4bae-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" (UID: \"f339c371ed771e712fed1f0ae6cb4bae\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.579562 kubelet[2807]: I1216 12:46:16.579306 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f4148a05de67f9b028b38d4dee4449f2-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-c-452f5360ea\" (UID: \"f4148a05de67f9b028b38d4dee4449f2\") " pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.579562 kubelet[2807]: I1216 12:46:16.579337 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f4148a05de67f9b028b38d4dee4449f2-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-c-452f5360ea\" (UID: \"f4148a05de67f9b028b38d4dee4449f2\") " pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.579562 kubelet[2807]: I1216 12:46:16.579371 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f339c371ed771e712fed1f0ae6cb4bae-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" (UID: \"f339c371ed771e712fed1f0ae6cb4bae\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.579562 kubelet[2807]: I1216 12:46:16.579393 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f339c371ed771e712fed1f0ae6cb4bae-k8s-certs\") pod 
\"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" (UID: \"f339c371ed771e712fed1f0ae6cb4bae\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.579839 kubelet[2807]: I1216 12:46:16.579416 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f339c371ed771e712fed1f0ae6cb4bae-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" (UID: \"f339c371ed771e712fed1f0ae6cb4bae\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.579839 kubelet[2807]: I1216 12:46:16.579440 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f339c371ed771e712fed1f0ae6cb4bae-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-c-452f5360ea\" (UID: \"f339c371ed771e712fed1f0ae6cb4bae\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.579839 kubelet[2807]: I1216 12:46:16.579464 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/84a7624c597f83bbc4e831003f342640-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-c-452f5360ea\" (UID: \"84a7624c597f83bbc4e831003f342640\") " pod="kube-system/kube-scheduler-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:16.579839 kubelet[2807]: I1216 12:46:16.579487 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f4148a05de67f9b028b38d4dee4449f2-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-c-452f5360ea\" (UID: \"f4148a05de67f9b028b38d4dee4449f2\") " pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:17.338594 kubelet[2807]: I1216 12:46:17.338549 2807 apiserver.go:52] "Watching apiserver" Dec 16 12:46:17.377469 kubelet[2807]: I1216 12:46:17.377403 2807 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 12:46:17.443626 kubelet[2807]: I1216 12:46:17.443594 2807 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:17.454597 kubelet[2807]: E1216 12:46:17.454241 2807 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-c-452f5360ea\" already exists" pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" Dec 16 12:46:17.485723 kubelet[2807]: I1216 12:46:17.485607 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-0-0-c-452f5360ea" podStartSLOduration=2.485593811 podStartE2EDuration="2.485593811s" podCreationTimestamp="2025-12-16 12:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:46:17.484905972 +0000 UTC m=+1.235164966" watchObservedRunningTime="2025-12-16 12:46:17.485593811 +0000 UTC m=+1.235852803" Dec 16 12:46:17.520113 kubelet[2807]: I1216 12:46:17.520066 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-0-0-c-452f5360ea" podStartSLOduration=2.520049268 podStartE2EDuration="2.520049268s" podCreationTimestamp="2025-12-16 12:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-16 12:46:17.49865544 +0000 UTC m=+1.248914433" watchObservedRunningTime="2025-12-16 12:46:17.520049268 +0000 UTC m=+1.270308260" Dec 16 12:46:22.332201 update_engine[1609]: I20251216 12:46:22.332103 1609 update_attempter.cc:509] Updating boot flags... Dec 16 12:46:22.776950 kubelet[2807]: I1216 12:46:22.776831 2807 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 12:46:22.777890 containerd[1624]: time="2025-12-16T12:46:22.777816989Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 16 12:46:22.778392 kubelet[2807]: I1216 12:46:22.778358 2807 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 12:46:23.914730 kubelet[2807]: I1216 12:46:23.914615 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-0-0-c-452f5360ea" podStartSLOduration=7.91458025 podStartE2EDuration="7.91458025s" podCreationTimestamp="2025-12-16 12:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:46:17.521172849 +0000 UTC m=+1.271431842" watchObservedRunningTime="2025-12-16 12:46:23.91458025 +0000 UTC m=+7.664839243" Dec 16 12:46:23.930819 kubelet[2807]: I1216 12:46:23.929308 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ed8a7fdb-18c4-4f1c-8695-a05c26074733-kube-proxy\") pod \"kube-proxy-gspzv\" (UID: \"ed8a7fdb-18c4-4f1c-8695-a05c26074733\") " pod="kube-system/kube-proxy-gspzv" Dec 16 12:46:23.930819 kubelet[2807]: I1216 12:46:23.929360 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ed8a7fdb-18c4-4f1c-8695-a05c26074733-xtables-lock\") pod \"kube-proxy-gspzv\" (UID: \"ed8a7fdb-18c4-4f1c-8695-a05c26074733\") " pod="kube-system/kube-proxy-gspzv" Dec 16 12:46:23.930819 kubelet[2807]: I1216 12:46:23.929430 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed8a7fdb-18c4-4f1c-8695-a05c26074733-lib-modules\") pod \"kube-proxy-gspzv\" (UID: \"ed8a7fdb-18c4-4f1c-8695-a05c26074733\") " pod="kube-system/kube-proxy-gspzv" Dec 16 12:46:23.930819 kubelet[2807]: I1216 12:46:23.930567 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7dr\" (UniqueName: \"kubernetes.io/projected/ed8a7fdb-18c4-4f1c-8695-a05c26074733-kube-api-access-gj7dr\") pod \"kube-proxy-gspzv\" (UID: \"ed8a7fdb-18c4-4f1c-8695-a05c26074733\") " pod="kube-system/kube-proxy-gspzv" Dec 16 12:46:23.937712 systemd[1]: Created slice kubepods-besteffort-poded8a7fdb_18c4_4f1c_8695_a05c26074733.slice - libcontainer container kubepods-besteffort-poded8a7fdb_18c4_4f1c_8695_a05c26074733.slice. Dec 16 12:46:24.046752 systemd[1]: Created slice kubepods-besteffort-podbd5858a0_4001_4920_a75c_e8e96bda999d.slice - libcontainer container kubepods-besteffort-podbd5858a0_4001_4920_a75c_e8e96bda999d.slice. 
Dec 16 12:46:24.133411 kubelet[2807]: I1216 12:46:24.133312 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/bd5858a0-4001-4920-a75c-e8e96bda999d-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-p8rrl\" (UID: \"bd5858a0-4001-4920-a75c-e8e96bda999d\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-p8rrl" Dec 16 12:46:24.133411 kubelet[2807]: I1216 12:46:24.133365 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h2qs\" (UniqueName: \"kubernetes.io/projected/bd5858a0-4001-4920-a75c-e8e96bda999d-kube-api-access-8h2qs\") pod \"tigera-operator-65cdcdfd6d-p8rrl\" (UID: \"bd5858a0-4001-4920-a75c-e8e96bda999d\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-p8rrl" Dec 16 12:46:24.251349 containerd[1624]: time="2025-12-16T12:46:24.251230419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gspzv,Uid:ed8a7fdb-18c4-4f1c-8695-a05c26074733,Namespace:kube-system,Attempt:0,}" Dec 16 12:46:24.274776 containerd[1624]: time="2025-12-16T12:46:24.274656700Z" level=info msg="connecting to shim 03b25b22ad5adf0365d77bf9b006978863846344fba96b8f518c6114b64940b7" address="unix:///run/containerd/s/94367c503ed14d20af60369d7ae2897d8928c8a570e5850df1c8185b89c92fd9" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:46:24.304747 systemd[1]: Started cri-containerd-03b25b22ad5adf0365d77bf9b006978863846344fba96b8f518c6114b64940b7.scope - libcontainer container 03b25b22ad5adf0365d77bf9b006978863846344fba96b8f518c6114b64940b7. Dec 16 12:46:24.316000 audit: BPF prog-id=133 op=LOAD Dec 16 12:46:24.320414 kernel: kauditd_printk_skb: 32 callbacks suppressed Dec 16 12:46:24.320542 kernel: audit: type=1334 audit(1765889184.316:438): prog-id=133 op=LOAD Dec 16 12:46:24.317000 audit: BPF prog-id=134 op=LOAD Dec 16 12:46:24.325695 kernel: audit: type=1334 audit(1765889184.317:439): prog-id=134 op=LOAD Dec 16 12:46:24.325773 kernel: audit: type=1300 audit(1765889184.317:439): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2891 pid=2902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.317000 audit[2902]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2891 pid=2902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623235623232616435616466303336356437376266396230303639 Dec 16 12:46:24.332236 kernel: audit: type=1327 audit(1765889184.317:439): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623235623232616435616466303336356437376266396230303639 Dec 16 12:46:24.317000 audit: BPF prog-id=134 op=UNLOAD Dec 16 12:46:24.338805 kernel: audit: type=1334 audit(1765889184.317:440): prog-id=134 op=UNLOAD Dec 16 12:46:24.317000 audit[2902]: SYSCALL arch=c000003e syscall=3 
success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2891 pid=2902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.342238 kernel: audit: type=1300 audit(1765889184.317:440): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2891 pid=2902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623235623232616435616466303336356437376266396230303639 Dec 16 12:46:24.349089 kernel: audit: type=1327 audit(1765889184.317:440): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623235623232616435616466303336356437376266396230303639 Dec 16 12:46:24.317000 audit: BPF prog-id=135 op=LOAD Dec 16 12:46:24.356117 kernel: audit: type=1334 audit(1765889184.317:441): prog-id=135 op=LOAD Dec 16 12:46:24.356155 kernel: audit: type=1300 audit(1765889184.317:441): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2891 pid=2902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.317000 audit[2902]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2891 pid=2902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.359806 containerd[1624]: time="2025-12-16T12:46:24.359705635Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gspzv,Uid:ed8a7fdb-18c4-4f1c-8695-a05c26074733,Namespace:kube-system,Attempt:0,} returns sandbox id \"03b25b22ad5adf0365d77bf9b006978863846344fba96b8f518c6114b64940b7\"" Dec 16 12:46:24.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623235623232616435616466303336356437376266396230303639 Dec 16 12:46:24.365051 kernel: audit: type=1327 audit(1765889184.317:441): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623235623232616435616466303336356437376266396230303639 Dec 16 12:46:24.317000 audit: BPF prog-id=136 op=LOAD Dec 16 12:46:24.317000 audit[2902]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2891 pid=2902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.317000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623235623232616435616466303336356437376266396230303639 Dec 16 12:46:24.317000 audit: BPF prog-id=136 op=UNLOAD Dec 16 12:46:24.317000 audit[2902]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2891 pid=2902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623235623232616435616466303336356437376266396230303639 Dec 16 12:46:24.317000 audit: BPF prog-id=135 op=UNLOAD Dec 16 12:46:24.317000 audit[2902]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2891 pid=2902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623235623232616435616466303336356437376266396230303639 Dec 16 12:46:24.317000 audit: BPF prog-id=137 op=LOAD Dec 16 12:46:24.317000 audit[2902]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2891 pid=2902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623235623232616435616466303336356437376266396230303639 Dec 16 12:46:24.372323 containerd[1624]: time="2025-12-16T12:46:24.371755377Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-p8rrl,Uid:bd5858a0-4001-4920-a75c-e8e96bda999d,Namespace:tigera-operator,Attempt:0,}" Dec 16 12:46:24.372518 containerd[1624]: time="2025-12-16T12:46:24.372294375Z" level=info msg="CreateContainer within sandbox \"03b25b22ad5adf0365d77bf9b006978863846344fba96b8f518c6114b64940b7\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 12:46:24.386004 containerd[1624]: time="2025-12-16T12:46:24.385927213Z" level=info msg="Container 86b7c417df7f414ff09a0f82ae469db89a6ffbc644ef10f3ad85cda309314420: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:46:24.395727 containerd[1624]: time="2025-12-16T12:46:24.395676968Z" level=info msg="CreateContainer within sandbox \"03b25b22ad5adf0365d77bf9b006978863846344fba96b8f518c6114b64940b7\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"86b7c417df7f414ff09a0f82ae469db89a6ffbc644ef10f3ad85cda309314420\"" Dec 16 12:46:24.396965 containerd[1624]: time="2025-12-16T12:46:24.396904042Z" level=info msg="StartContainer for \"86b7c417df7f414ff09a0f82ae469db89a6ffbc644ef10f3ad85cda309314420\"" Dec 16 
12:46:24.399144 containerd[1624]: time="2025-12-16T12:46:24.399118500Z" level=info msg="connecting to shim 0b1dcdf3a200f24cb7aee7607d6797553e294aea2ef5c50604e6aa85fea57c74" address="unix:///run/containerd/s/db2811244de4879070c05f3deff7168fd40f9d31090d2e4314c4a8480daa10da" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:46:24.401889 containerd[1624]: time="2025-12-16T12:46:24.401858138Z" level=info msg="connecting to shim 86b7c417df7f414ff09a0f82ae469db89a6ffbc644ef10f3ad85cda309314420" address="unix:///run/containerd/s/94367c503ed14d20af60369d7ae2897d8928c8a570e5850df1c8185b89c92fd9" protocol=ttrpc version=3 Dec 16 12:46:24.428866 systemd[1]: Started cri-containerd-86b7c417df7f414ff09a0f82ae469db89a6ffbc644ef10f3ad85cda309314420.scope - libcontainer container 86b7c417df7f414ff09a0f82ae469db89a6ffbc644ef10f3ad85cda309314420. Dec 16 12:46:24.437835 systemd[1]: Started cri-containerd-0b1dcdf3a200f24cb7aee7607d6797553e294aea2ef5c50604e6aa85fea57c74.scope - libcontainer container 0b1dcdf3a200f24cb7aee7607d6797553e294aea2ef5c50604e6aa85fea57c74. Dec 16 12:46:24.457000 audit: BPF prog-id=138 op=LOAD Dec 16 12:46:24.458000 audit: BPF prog-id=139 op=LOAD Dec 16 12:46:24.458000 audit[2954]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2938 pid=2954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.458000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062316463646633613230306632346362376165653736303764363739 Dec 16 12:46:24.458000 audit: BPF prog-id=139 op=UNLOAD Dec 16 12:46:24.458000 audit[2954]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2938 pid=2954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.458000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062316463646633613230306632346362376165653736303764363739 Dec 16 12:46:24.458000 audit: BPF prog-id=140 op=LOAD Dec 16 12:46:24.458000 audit[2954]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2938 pid=2954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.458000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062316463646633613230306632346362376165653736303764363739 Dec 16 12:46:24.458000 audit: BPF prog-id=141 op=LOAD Dec 16 12:46:24.458000 audit[2954]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2938 pid=2954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:46:24.458000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062316463646633613230306632346362376165653736303764363739 Dec 16 12:46:24.458000 audit: BPF prog-id=141 op=UNLOAD Dec 16 12:46:24.458000 audit[2954]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2938 pid=2954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.458000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062316463646633613230306632346362376165653736303764363739 Dec 16 12:46:24.458000 audit: BPF prog-id=140 op=UNLOAD Dec 16 12:46:24.458000 audit[2954]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2938 pid=2954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.458000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062316463646633613230306632346362376165653736303764363739 Dec 16 12:46:24.458000 audit: BPF prog-id=142 op=LOAD Dec 16 12:46:24.458000 audit[2954]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2938 pid=2954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.458000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3062316463646633613230306632346362376165653736303764363739 Dec 16 12:46:24.496437 containerd[1624]: time="2025-12-16T12:46:24.496318780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-p8rrl,Uid:bd5858a0-4001-4920-a75c-e8e96bda999d,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0b1dcdf3a200f24cb7aee7607d6797553e294aea2ef5c50604e6aa85fea57c74\"" Dec 16 12:46:24.497000 audit: BPF prog-id=143 op=LOAD Dec 16 12:46:24.497000 audit[2942]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2891 pid=2942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836623763343137646637663431346666303961306638326165343639 Dec 16 12:46:24.497000 audit: BPF prog-id=144 op=LOAD Dec 16 12:46:24.497000 audit[2942]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2891 
pid=2942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836623763343137646637663431346666303961306638326165343639 Dec 16 12:46:24.497000 audit: BPF prog-id=144 op=UNLOAD Dec 16 12:46:24.497000 audit[2942]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2891 pid=2942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836623763343137646637663431346666303961306638326165343639 Dec 16 12:46:24.497000 audit: BPF prog-id=143 op=UNLOAD Dec 16 12:46:24.497000 audit[2942]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2891 pid=2942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836623763343137646637663431346666303961306638326165343639 Dec 16 12:46:24.497000 audit: BPF prog-id=145 op=LOAD Dec 16 12:46:24.497000 audit[2942]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2891 pid=2942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836623763343137646637663431346666303961306638326165343639 Dec 16 12:46:24.500898 containerd[1624]: time="2025-12-16T12:46:24.500765240Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 12:46:24.530918 containerd[1624]: time="2025-12-16T12:46:24.530614293Z" level=info msg="StartContainer for \"86b7c417df7f414ff09a0f82ae469db89a6ffbc644ef10f3ad85cda309314420\" returns successfully" Dec 16 12:46:24.921000 audit[3040]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3040 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:24.921000 audit[3040]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd65decc60 a2=0 a3=7ffd65decc4c items=0 ppid=2972 pid=3040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.921000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:46:24.922000 
audit[3041]: NETFILTER_CFG table=nat:55 family=2 entries=1 op=nft_register_chain pid=3041 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:24.922000 audit[3041]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd313c97d0 a2=0 a3=7ffd313c97bc items=0 ppid=2972 pid=3041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.922000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:46:24.924000 audit[3042]: NETFILTER_CFG table=filter:56 family=2 entries=1 op=nft_register_chain pid=3042 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:24.924000 audit[3042]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd9a7a0ad0 a2=0 a3=7ffd9a7a0abc items=0 ppid=2972 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.924000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:46:24.925000 audit[3043]: NETFILTER_CFG table=mangle:57 family=10 entries=1 op=nft_register_chain pid=3043 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:24.925000 audit[3043]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcab871fb0 a2=0 a3=7ffcab871f9c items=0 ppid=2972 pid=3043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.925000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:46:24.927000 audit[3044]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3044 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:24.927000 audit[3044]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff6f4a6a80 a2=0 a3=7fff6f4a6a6c items=0 ppid=2972 pid=3044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.927000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:46:24.928000 audit[3047]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3047 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:24.928000 audit[3047]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd8973f4a0 a2=0 a3=7ffd8973f48c items=0 ppid=2972 pid=3047 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:24.928000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:46:25.031000 audit[3048]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3048 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.031000 audit[3048]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe467a8db0 a2=0 
a3=7ffe467a8d9c items=0 ppid=2972 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.031000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:46:25.036000 audit[3050]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3050 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.036000 audit[3050]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffcb9cbeb50 a2=0 a3=7ffcb9cbeb3c items=0 ppid=2972 pid=3050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.036000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73002D Dec 16 12:46:25.041000 audit[3053]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.041000 audit[3053]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffe2b403070 a2=0 a3=7ffe2b40305c items=0 ppid=2972 pid=3053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.041000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Dec 16 12:46:25.042000 audit[3054]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3054 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.042000 audit[3054]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffee544fba0 a2=0 a3=7ffee544fb8c items=0 ppid=2972 pid=3054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.042000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:46:25.045000 audit[3056]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3056 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.045000 audit[3056]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd948b3250 a2=0 a3=7ffd948b323c items=0 ppid=2972 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.045000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:46:25.046000 audit[3057]: NETFILTER_CFG table=filter:65 family=2 
entries=1 op=nft_register_chain pid=3057 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.046000 audit[3057]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe31ad09b0 a2=0 a3=7ffe31ad099c items=0 ppid=2972 pid=3057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.046000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:46:25.049000 audit[3059]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3059 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.049000 audit[3059]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffcea37b590 a2=0 a3=7ffcea37b57c items=0 ppid=2972 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.049000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:46:25.053000 audit[3062]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3062 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.053000 audit[3062]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd7098bd90 a2=0 a3=7ffd7098bd7c items=0 ppid=2972 pid=3062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.053000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:46:25.054000 audit[3063]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3063 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.054000 audit[3063]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd302c15e0 a2=0 a3=7ffd302c15cc items=0 ppid=2972 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.054000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:46:25.057000 audit[3065]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3065 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.057000 audit[3065]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffde0938c80 a2=0 a3=7ffde0938c6c items=0 ppid=2972 pid=3065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.057000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:46:25.058000 audit[3066]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3066 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.058000 audit[3066]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc93551dc0 a2=0 a3=7ffc93551dac items=0 ppid=2972 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.058000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:46:25.067000 audit[3068]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3068 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.067000 audit[3068]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe4ccacd20 a2=0 a3=7ffe4ccacd0c items=0 ppid=2972 pid=3068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.067000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F5859 Dec 16 12:46:25.073000 audit[3071]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3071 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.073000 audit[3071]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff66d44d50 a2=0 a3=7fff66d44d3c items=0 ppid=2972 pid=3071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.073000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Dec 16 12:46:25.077000 audit[3074]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3074 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.077000 audit[3074]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe2461f8c0 a2=0 a3=7ffe2461f8ac items=0 ppid=2972 pid=3074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.077000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Dec 16 12:46:25.078000 audit[3075]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3075 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.078000 audit[3075]: 
SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff9d308d90 a2=0 a3=7fff9d308d7c items=0 ppid=2972 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.078000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:46:25.082000 audit[3077]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3077 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.082000 audit[3077]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffc90206660 a2=0 a3=7ffc9020664c items=0 ppid=2972 pid=3077 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.082000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:46:25.085000 audit[3080]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3080 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.085000 audit[3080]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffdfd6563d0 a2=0 a3=7ffdfd6563bc items=0 ppid=2972 pid=3080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.085000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:46:25.086000 audit[3081]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3081 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.086000 audit[3081]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe8198a8b0 a2=0 a3=7ffe8198a89c items=0 ppid=2972 pid=3081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.086000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:46:25.089000 audit[3083]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3083 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:46:25.089000 audit[3083]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffcc6962b50 a2=0 a3=7ffcc6962b3c items=0 ppid=2972 pid=3083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.089000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:46:25.111000 audit[3089]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3089 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Dec 16 12:46:25.111000 audit[3089]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff1447bdd0 a2=0 a3=7fff1447bdbc items=0 ppid=2972 pid=3089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.111000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:25.121000 audit[3089]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3089 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:25.121000 audit[3089]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7fff1447bdd0 a2=0 a3=7fff1447bdbc items=0 ppid=2972 pid=3089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.121000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:25.123000 audit[3094]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.123000 audit[3094]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe633b7560 a2=0 a3=7ffe633b754c items=0 ppid=2972 pid=3094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.123000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:46:25.126000 audit[3096]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3096 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.126000 audit[3096]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffd8c4ea230 a2=0 a3=7ffd8c4ea21c items=0 ppid=2972 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.126000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Dec 16 12:46:25.130000 audit[3099]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3099 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.130000 audit[3099]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffd74bb0830 a2=0 a3=7ffd74bb081c items=0 ppid=2972 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.130000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C Dec 16 12:46:25.131000 audit[3100]: 
NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.131000 audit[3100]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe12279620 a2=0 a3=7ffe1227960c items=0 ppid=2972 pid=3100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.131000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:46:25.134000 audit[3102]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.134000 audit[3102]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd0df81080 a2=0 a3=7ffd0df8106c items=0 ppid=2972 pid=3102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.134000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:46:25.135000 audit[3103]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3103 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.135000 audit[3103]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd1047d990 a2=0 a3=7ffd1047d97c items=0 ppid=2972 pid=3103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.135000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:46:25.138000 audit[3105]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3105 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.138000 audit[3105]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff0d42e230 a2=0 a3=7fff0d42e21c items=0 ppid=2972 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.138000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:46:25.142000 audit[3108]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3108 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.142000 audit[3108]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7fff45203b70 a2=0 a3=7fff45203b5c items=0 ppid=2972 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.142000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:46:25.143000 audit[3109]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3109 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.143000 audit[3109]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff8e87e420 a2=0 a3=7fff8e87e40c items=0 ppid=2972 pid=3109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.143000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:46:25.146000 audit[3111]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3111 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.146000 audit[3111]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc4df72170 a2=0 a3=7ffc4df7215c items=0 ppid=2972 pid=3111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.146000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:46:25.147000 audit[3112]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3112 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.147000 audit[3112]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdc54fb670 a2=0 a3=7ffdc54fb65c items=0 ppid=2972 pid=3112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.147000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:46:25.150000 audit[3114]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3114 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.150000 audit[3114]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd687d75b0 a2=0 a3=7ffd687d759c items=0 ppid=2972 pid=3114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.150000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Dec 16 12:46:25.154000 audit[3117]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3117 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.154000 audit[3117]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fffa17eca40 a2=0 a3=7fffa17eca2c items=0 ppid=2972 pid=3117 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.154000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Dec 16 12:46:25.158000 audit[3120]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3120 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.158000 audit[3120]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffeb6a54370 a2=0 a3=7ffeb6a5435c items=0 ppid=2972 pid=3120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.158000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D5052 Dec 16 12:46:25.159000 audit[3121]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.159000 audit[3121]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffddcbd43d0 a2=0 a3=7ffddcbd43bc items=0 ppid=2972 pid=3121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.159000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:46:25.161000 audit[3123]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3123 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.161000 audit[3123]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7fff6caa1240 a2=0 a3=7fff6caa122c items=0 ppid=2972 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.161000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:46:25.165000 audit[3126]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3126 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.165000 audit[3126]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc73f93310 a2=0 a3=7ffc73f932fc items=0 ppid=2972 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.165000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:46:25.166000 audit[3127]: NETFILTER_CFG table=nat:98 family=10 
entries=1 op=nft_register_chain pid=3127 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.166000 audit[3127]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe0003aad0 a2=0 a3=7ffe0003aabc items=0 ppid=2972 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.166000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:46:25.168000 audit[3129]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3129 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.168000 audit[3129]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7fff356308b0 a2=0 a3=7fff3563089c items=0 ppid=2972 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.168000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:46:25.170000 audit[3130]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3130 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.170000 audit[3130]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe861e0640 a2=0 a3=7ffe861e062c items=0 ppid=2972 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.170000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:46:25.178000 audit[3132]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3132 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.178000 audit[3132]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffc2bda89c0 a2=0 a3=7ffc2bda89ac items=0 ppid=2972 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.178000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:46:25.181000 audit[3135]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3135 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:46:25.181000 audit[3135]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffcf3c983d0 a2=0 a3=7ffcf3c983bc items=0 ppid=2972 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.181000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:46:25.184000 audit[3137]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3137 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 
12:46:25.184000 audit[3137]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7ffc34519fd0 a2=0 a3=7ffc34519fbc items=0 ppid=2972 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.184000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:25.184000 audit[3137]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3137 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:46:25.184000 audit[3137]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffc34519fd0 a2=0 a3=7ffc34519fbc items=0 ppid=2972 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:25.184000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:25.481120 kubelet[2807]: I1216 12:46:25.480722 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-gspzv" podStartSLOduration=2.480709363 podStartE2EDuration="2.480709363s" podCreationTimestamp="2025-12-16 12:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:46:25.472412269 +0000 UTC m=+9.222671302" watchObservedRunningTime="2025-12-16 12:46:25.480709363 +0000 UTC m=+9.230968366" Dec 16 12:46:27.172332 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1172304724.mount: Deactivated successfully. 
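The long hex strings in the PROCTITLE records above are the audit subsystem's hex encoding of the process command line, with NUL bytes separating the arguments; the chain and rule operations logged here are kube-proxy driving iptables, ip6tables and iptables-restore to build its KUBE-* chains. A minimal Python sketch for turning one of those hex blobs back into a readable command (the sample value is copied verbatim from the iptables-restore entries above):

```python
# Minimal sketch: decode an audit PROCTITLE value (hex-encoded, NUL-separated
# argv) back into a readable command line. Sample copied from the log above.
def decode_proctitle(hex_blob: str) -> str:
    argv = bytes.fromhex(hex_blob).split(b"\x00")
    return " ".join(part.decode("utf-8", "replace") for part in argv if part)

print(decode_proctitle(
    "69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273"
))  # -> iptables-restore -w 5 --noflush --counters
```

Applied to the earlier entries, the same decoding yields commands such as `iptables -w 5 -N KUBE-EXTERNAL-SERVICES -t filter`. The audit subsystem captures only a fixed-length prefix of the command line, which is why several of the longer decoded commands break off mid-argument (e.g. ending in `KUBE-PROX` rather than `KUBE-PROXY-FIREWALL`).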
Dec 16 12:46:27.580698 containerd[1624]: time="2025-12-16T12:46:27.580622985Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:27.581804 containerd[1624]: time="2025-12-16T12:46:27.581709717Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Dec 16 12:46:27.582580 containerd[1624]: time="2025-12-16T12:46:27.582557435Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:27.584286 containerd[1624]: time="2025-12-16T12:46:27.584249183Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:27.584904 containerd[1624]: time="2025-12-16T12:46:27.584646735Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 3.083795833s" Dec 16 12:46:27.584904 containerd[1624]: time="2025-12-16T12:46:27.584684598Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Dec 16 12:46:27.588958 containerd[1624]: time="2025-12-16T12:46:27.588930161Z" level=info msg="CreateContainer within sandbox \"0b1dcdf3a200f24cb7aee7607d6797553e294aea2ef5c50604e6aa85fea57c74\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 12:46:27.595975 containerd[1624]: time="2025-12-16T12:46:27.595949512Z" level=info msg="Container ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:46:27.601668 containerd[1624]: time="2025-12-16T12:46:27.601630924Z" level=info msg="CreateContainer within sandbox \"0b1dcdf3a200f24cb7aee7607d6797553e294aea2ef5c50604e6aa85fea57c74\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e\"" Dec 16 12:46:27.602109 containerd[1624]: time="2025-12-16T12:46:27.602078264Z" level=info msg="StartContainer for \"ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e\"" Dec 16 12:46:27.603082 containerd[1624]: time="2025-12-16T12:46:27.603064511Z" level=info msg="connecting to shim ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e" address="unix:///run/containerd/s/db2811244de4879070c05f3deff7168fd40f9d31090d2e4314c4a8480daa10da" protocol=ttrpc version=3 Dec 16 12:46:27.623735 systemd[1]: Started cri-containerd-ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e.scope - libcontainer container ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e. 
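For scale, the tigera-operator image pull reported just above moved roughly 23.6 MB of layer data in about 3.08 s. A rough back-of-the-envelope check of the effective pull rate, using only the figures printed in the containerd lines (an approximation of throughput, not a measured network rate):

```python
# Rough estimate only: figures copied from the containerd lines above
# ("bytes read=23558205", "in 3.083795833s"); this is compressed transfer
# size, so it approximates pull throughput rather than measuring it.
bytes_read = 23_558_205
duration_s = 3.083795833
print(f"~{bytes_read / duration_s / 2**20:.1f} MiB/s")  # ~7.3 MiB/s
```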
Dec 16 12:46:27.634000 audit: BPF prog-id=146 op=LOAD Dec 16 12:46:27.634000 audit: BPF prog-id=147 op=LOAD Dec 16 12:46:27.634000 audit[3146]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2938 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:27.634000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162323161646130373565626137306335313766323430326134666636 Dec 16 12:46:27.635000 audit: BPF prog-id=147 op=UNLOAD Dec 16 12:46:27.635000 audit[3146]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2938 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:27.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162323161646130373565626137306335313766323430326134666636 Dec 16 12:46:27.635000 audit: BPF prog-id=148 op=LOAD Dec 16 12:46:27.635000 audit[3146]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2938 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:27.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162323161646130373565626137306335313766323430326134666636 Dec 16 12:46:27.635000 audit: BPF prog-id=149 op=LOAD Dec 16 12:46:27.635000 audit[3146]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=2938 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:27.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162323161646130373565626137306335313766323430326134666636 Dec 16 12:46:27.635000 audit: BPF prog-id=149 op=UNLOAD Dec 16 12:46:27.635000 audit[3146]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2938 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:27.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162323161646130373565626137306335313766323430326134666636 Dec 16 12:46:27.635000 audit: BPF prog-id=148 op=UNLOAD Dec 16 12:46:27.635000 audit[3146]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2938 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:27.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162323161646130373565626137306335313766323430326134666636 Dec 16 12:46:27.635000 audit: BPF prog-id=150 op=LOAD Dec 16 12:46:27.635000 audit[3146]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=2938 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:27.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162323161646130373565626137306335313766323430326134666636 Dec 16 12:46:27.650666 containerd[1624]: time="2025-12-16T12:46:27.650600608Z" level=info msg="StartContainer for \"ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e\" returns successfully" Dec 16 12:46:28.484336 kubelet[2807]: I1216 12:46:28.484177 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-p8rrl" podStartSLOduration=2.39753604 podStartE2EDuration="5.484156189s" podCreationTimestamp="2025-12-16 12:46:23 +0000 UTC" firstStartedPulling="2025-12-16 12:46:24.49875408 +0000 UTC m=+8.249013073" lastFinishedPulling="2025-12-16 12:46:27.585374229 +0000 UTC m=+11.335633222" observedRunningTime="2025-12-16 12:46:28.483265732 +0000 UTC m=+12.233524735" watchObservedRunningTime="2025-12-16 12:46:28.484156189 +0000 UTC m=+12.234415192" Dec 16 12:46:33.711470 sudo[1876]: pam_unix(sudo:session): session closed for user root Dec 16 12:46:33.715571 kernel: kauditd_printk_skb: 224 callbacks suppressed Dec 16 12:46:33.715659 kernel: audit: type=1106 audit(1765889193.710:518): pid=1876 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:46:33.710000 audit[1876]: USER_END pid=1876 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:46:33.710000 audit[1876]: CRED_DISP pid=1876 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:46:33.727565 kernel: audit: type=1104 audit(1765889193.710:519): pid=1876 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 12:46:33.870057 sshd[1875]: Connection closed by 147.75.109.163 port 47784 Dec 16 12:46:33.870579 sshd-session[1871]: pam_unix(sshd:session): session closed for user core Dec 16 12:46:33.881546 kernel: audit: type=1106 audit(1765889193.872:520): pid=1871 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:46:33.872000 audit[1871]: USER_END pid=1871 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:46:33.886352 systemd[1]: sshd@6-77.42.23.34:22-147.75.109.163:47784.service: Deactivated successfully. Dec 16 12:46:33.888595 systemd[1]: session-8.scope: Deactivated successfully. Dec 16 12:46:33.888775 systemd[1]: session-8.scope: Consumed 4.421s CPU time, 166.1M memory peak. Dec 16 12:46:33.891749 systemd-logind[1607]: Session 8 logged out. Waiting for processes to exit. Dec 16 12:46:33.894376 systemd-logind[1607]: Removed session 8. Dec 16 12:46:33.881000 audit[1871]: CRED_DISP pid=1871 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:46:33.900555 kernel: audit: type=1104 audit(1765889193.881:521): pid=1871 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:46:33.885000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-77.42.23.34:22-147.75.109.163:47784 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:46:33.912248 kernel: audit: type=1131 audit(1765889193.885:522): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-77.42.23.34:22-147.75.109.163:47784 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:46:34.699000 audit[3226]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:34.706144 kernel: audit: type=1325 audit(1765889194.699:523): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:34.699000 audit[3226]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffccff8a510 a2=0 a3=7ffccff8a4fc items=0 ppid=2972 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:34.715632 kernel: audit: type=1300 audit(1765889194.699:523): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffccff8a510 a2=0 a3=7ffccff8a4fc items=0 ppid=2972 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:34.699000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:34.720576 kernel: audit: type=1327 audit(1765889194.699:523): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:34.715000 audit[3226]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:34.724541 kernel: audit: type=1325 audit(1765889194.715:524): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:34.715000 audit[3226]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffccff8a510 a2=0 a3=0 items=0 ppid=2972 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:34.732563 kernel: audit: type=1300 audit(1765889194.715:524): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffccff8a510 a2=0 a3=0 items=0 ppid=2972 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:34.715000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:35.778000 audit[3228]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3228 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:35.778000 audit[3228]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffce8c11c60 a2=0 a3=7ffce8c11c4c items=0 ppid=2972 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:35.778000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:35.794000 audit[3228]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3228 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:35.794000 
audit[3228]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffce8c11c60 a2=0 a3=0 items=0 ppid=2972 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:35.794000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:36.819000 audit[3230]: NETFILTER_CFG table=filter:109 family=2 entries=18 op=nft_register_rule pid=3230 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:36.819000 audit[3230]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fff112b7d40 a2=0 a3=7fff112b7d2c items=0 ppid=2972 pid=3230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:36.819000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:36.825000 audit[3230]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3230 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:36.825000 audit[3230]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff112b7d40 a2=0 a3=0 items=0 ppid=2972 pid=3230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:36.825000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:38.398000 audit[3232]: NETFILTER_CFG table=filter:111 family=2 entries=21 op=nft_register_rule pid=3232 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:38.398000 audit[3232]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe272def90 a2=0 a3=7ffe272def7c items=0 ppid=2972 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.398000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:38.403000 audit[3232]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3232 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:38.403000 audit[3232]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe272def90 a2=0 a3=0 items=0 ppid=2972 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.403000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:38.443139 systemd[1]: Created slice kubepods-besteffort-pod1f4fe4c5_3364_4602_bf63_3dcb7b385529.slice - libcontainer container kubepods-besteffort-pod1f4fe4c5_3364_4602_bf63_3dcb7b385529.slice. 
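The NETFILTER_CFG records in this stretch share one fixed key=value layout (family=2 is IPv4, family=10 is IPv6), and the growing entries= counts on the iptables-restore calls (15, 16, 18, then 21 filter rules) track kube-proxy resyncing its rule set as services are added. A minimal sketch for pulling those fields out of a log line, assuming only the record format shown above:

```python
import re

# Minimal sketch: extract table, generation, address family, rule count and
# operation from a NETFILTER_CFG audit record (format as seen in this log).
NFT_RE = re.compile(
    r"NETFILTER_CFG table=(?P<table>[\w-]+):(?P<gen>\d+) "
    r"family=(?P<family>\d+) entries=(?P<entries>\d+) op=(?P<op>\w+)"
)

def parse_netfilter_cfg(line):
    m = NFT_RE.search(line)
    return m.groupdict() if m else None

sample = ('audit[3232]: NETFILTER_CFG table=filter:111 family=2 entries=21 '
          'op=nft_register_rule pid=3232 comm="iptables-restor"')
print(parse_netfilter_cfg(sample))
# {'table': 'filter', 'gen': '111', 'family': '2',
#  'entries': '21', 'op': 'nft_register_rule'}
```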
Dec 16 12:46:38.531561 kubelet[2807]: I1216 12:46:38.531491 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9tt7\" (UniqueName: \"kubernetes.io/projected/1f4fe4c5-3364-4602-bf63-3dcb7b385529-kube-api-access-c9tt7\") pod \"calico-typha-5547cc86fd-bfvwn\" (UID: \"1f4fe4c5-3364-4602-bf63-3dcb7b385529\") " pod="calico-system/calico-typha-5547cc86fd-bfvwn" Dec 16 12:46:38.531561 kubelet[2807]: I1216 12:46:38.531563 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f4fe4c5-3364-4602-bf63-3dcb7b385529-tigera-ca-bundle\") pod \"calico-typha-5547cc86fd-bfvwn\" (UID: \"1f4fe4c5-3364-4602-bf63-3dcb7b385529\") " pod="calico-system/calico-typha-5547cc86fd-bfvwn" Dec 16 12:46:38.531953 kubelet[2807]: I1216 12:46:38.531583 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/1f4fe4c5-3364-4602-bf63-3dcb7b385529-typha-certs\") pod \"calico-typha-5547cc86fd-bfvwn\" (UID: \"1f4fe4c5-3364-4602-bf63-3dcb7b385529\") " pod="calico-system/calico-typha-5547cc86fd-bfvwn" Dec 16 12:46:38.616161 systemd[1]: Created slice kubepods-besteffort-pod5a663a37_9473_4671_a0c0_fcbe5cb82e9f.slice - libcontainer container kubepods-besteffort-pod5a663a37_9473_4671_a0c0_fcbe5cb82e9f.slice. Dec 16 12:46:38.632630 kubelet[2807]: I1216 12:46:38.632593 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-xtables-lock\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.632739 kubelet[2807]: I1216 12:46:38.632649 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-policysync\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.632739 kubelet[2807]: I1216 12:46:38.632666 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhxb2\" (UniqueName: \"kubernetes.io/projected/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-kube-api-access-xhxb2\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.632739 kubelet[2807]: I1216 12:46:38.632686 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-cni-log-dir\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.632739 kubelet[2807]: I1216 12:46:38.632700 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-flexvol-driver-host\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.632739 kubelet[2807]: I1216 12:46:38.632714 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-tigera-ca-bundle\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.632838 kubelet[2807]: I1216 12:46:38.632727 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-var-lib-calico\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.632838 kubelet[2807]: I1216 12:46:38.632740 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-cni-net-dir\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.632838 kubelet[2807]: I1216 12:46:38.632755 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-cni-bin-dir\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.632838 kubelet[2807]: I1216 12:46:38.632784 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-node-certs\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.632838 kubelet[2807]: I1216 12:46:38.632796 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-var-run-calico\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.633214 kubelet[2807]: I1216 12:46:38.632808 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a663a37-9473-4671-a0c0-fcbe5cb82e9f-lib-modules\") pod \"calico-node-v4brf\" (UID: \"5a663a37-9473-4671-a0c0-fcbe5cb82e9f\") " pod="calico-system/calico-node-v4brf" Dec 16 12:46:38.759722 kubelet[2807]: E1216 12:46:38.759594 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.759722 kubelet[2807]: W1216 12:46:38.759643 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.759722 kubelet[2807]: E1216 12:46:38.759676 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:38.761451 kubelet[2807]: E1216 12:46:38.761409 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.761451 kubelet[2807]: W1216 12:46:38.761442 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.763575 kubelet[2807]: E1216 12:46:38.763418 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.767149 containerd[1624]: time="2025-12-16T12:46:38.766655947Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5547cc86fd-bfvwn,Uid:1f4fe4c5-3364-4602-bf63-3dcb7b385529,Namespace:calico-system,Attempt:0,}" Dec 16 12:46:38.820019 containerd[1624]: time="2025-12-16T12:46:38.819719221Z" level=info msg="connecting to shim 307670884ca008818e6a08b098ff2dc8c4d20bb5599e5cb73f38e48f698c4ca6" address="unix:///run/containerd/s/85c32dca12e72a3d65e471ae396f57ac44861e5bd9e7a50e75ab515c06311744" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:46:38.856715 systemd[1]: Started cri-containerd-307670884ca008818e6a08b098ff2dc8c4d20bb5599e5cb73f38e48f698c4ca6.scope - libcontainer container 307670884ca008818e6a08b098ff2dc8c4d20bb5599e5cb73f38e48f698c4ca6. Dec 16 12:46:38.871581 kubelet[2807]: E1216 12:46:38.870621 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:46:38.889353 kernel: kauditd_printk_skb: 19 callbacks suppressed Dec 16 12:46:38.889445 kernel: audit: type=1334 audit(1765889198.884:531): prog-id=151 op=LOAD Dec 16 12:46:38.884000 audit: BPF prog-id=151 op=LOAD Dec 16 12:46:38.885000 audit: BPF prog-id=152 op=LOAD Dec 16 12:46:38.894548 kernel: audit: type=1334 audit(1765889198.885:532): prog-id=152 op=LOAD Dec 16 12:46:38.885000 audit[3259]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3248 pid=3259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.902545 kernel: audit: type=1300 audit(1765889198.885:532): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3248 pid=3259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.885000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330373637303838346361303038383138653661303862303938666632 Dec 16 12:46:38.914689 kernel: audit: type=1327 audit(1765889198.885:532): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330373637303838346361303038383138653661303862303938666632 Dec 16 12:46:38.920345 kernel: audit: type=1334 audit(1765889198.886:533): prog-id=152 op=UNLOAD Dec 16 12:46:38.886000 audit: BPF prog-id=152 op=UNLOAD Dec 16 12:46:38.920437 kubelet[2807]: E1216 12:46:38.916991 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920437 kubelet[2807]: W1216 12:46:38.917007 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920437 kubelet[2807]: E1216 12:46:38.917044 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.920437 kubelet[2807]: E1216 12:46:38.917178 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920437 kubelet[2807]: W1216 12:46:38.917201 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920437 kubelet[2807]: E1216 12:46:38.917210 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.920437 kubelet[2807]: E1216 12:46:38.917335 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920437 kubelet[2807]: W1216 12:46:38.917341 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920437 kubelet[2807]: E1216 12:46:38.917349 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.920437 kubelet[2807]: E1216 12:46:38.917543 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920648 kubelet[2807]: W1216 12:46:38.917552 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920648 kubelet[2807]: E1216 12:46:38.917559 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:38.920648 kubelet[2807]: E1216 12:46:38.917687 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920648 kubelet[2807]: W1216 12:46:38.917694 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920648 kubelet[2807]: E1216 12:46:38.917717 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.920648 kubelet[2807]: E1216 12:46:38.917906 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920648 kubelet[2807]: W1216 12:46:38.917913 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920648 kubelet[2807]: E1216 12:46:38.917920 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.920648 kubelet[2807]: E1216 12:46:38.918052 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920648 kubelet[2807]: W1216 12:46:38.918060 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920800 kubelet[2807]: E1216 12:46:38.918067 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.920800 kubelet[2807]: E1216 12:46:38.918186 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920800 kubelet[2807]: W1216 12:46:38.918193 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920800 kubelet[2807]: E1216 12:46:38.918199 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.920800 kubelet[2807]: E1216 12:46:38.918334 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920800 kubelet[2807]: W1216 12:46:38.918342 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920800 kubelet[2807]: E1216 12:46:38.918348 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:38.920800 kubelet[2807]: E1216 12:46:38.918451 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920800 kubelet[2807]: W1216 12:46:38.918458 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920800 kubelet[2807]: E1216 12:46:38.918464 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.920963 kubelet[2807]: E1216 12:46:38.918582 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920963 kubelet[2807]: W1216 12:46:38.918588 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920963 kubelet[2807]: E1216 12:46:38.918594 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.920963 kubelet[2807]: E1216 12:46:38.918702 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920963 kubelet[2807]: W1216 12:46:38.918737 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920963 kubelet[2807]: E1216 12:46:38.918756 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.920963 kubelet[2807]: E1216 12:46:38.918970 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.920963 kubelet[2807]: W1216 12:46:38.918977 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.920963 kubelet[2807]: E1216 12:46:38.918984 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.920963 kubelet[2807]: E1216 12:46:38.919093 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.921111 kubelet[2807]: W1216 12:46:38.919112 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.921111 kubelet[2807]: E1216 12:46:38.919123 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:38.921111 kubelet[2807]: E1216 12:46:38.919229 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.921111 kubelet[2807]: W1216 12:46:38.919235 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.921111 kubelet[2807]: E1216 12:46:38.919241 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.921111 kubelet[2807]: E1216 12:46:38.919372 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.921111 kubelet[2807]: W1216 12:46:38.919381 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.921111 kubelet[2807]: E1216 12:46:38.919387 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.921111 kubelet[2807]: E1216 12:46:38.919508 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.921111 kubelet[2807]: W1216 12:46:38.919514 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.921260 kubelet[2807]: E1216 12:46:38.919566 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.921260 kubelet[2807]: E1216 12:46:38.919714 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.921260 kubelet[2807]: W1216 12:46:38.919726 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.921260 kubelet[2807]: E1216 12:46:38.919742 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.921260 kubelet[2807]: E1216 12:46:38.919940 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.921260 kubelet[2807]: W1216 12:46:38.919947 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.921260 kubelet[2807]: E1216 12:46:38.919954 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:38.921260 kubelet[2807]: E1216 12:46:38.920093 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.921260 kubelet[2807]: W1216 12:46:38.920099 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.921260 kubelet[2807]: E1216 12:46:38.920105 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.924563 containerd[1624]: time="2025-12-16T12:46:38.922872680Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v4brf,Uid:5a663a37-9473-4671-a0c0-fcbe5cb82e9f,Namespace:calico-system,Attempt:0,}" Dec 16 12:46:38.932587 kernel: audit: type=1300 audit(1765889198.886:533): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3248 pid=3259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.886000 audit[3259]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3248 pid=3259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.941613 kernel: audit: type=1327 audit(1765889198.886:533): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330373637303838346361303038383138653661303862303938666632 Dec 16 12:46:38.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330373637303838346361303038383138653661303862303938666632 Dec 16 12:46:38.941750 kubelet[2807]: E1216 12:46:38.935605 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.941750 kubelet[2807]: W1216 12:46:38.935633 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.941750 kubelet[2807]: E1216 12:46:38.935650 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:38.941750 kubelet[2807]: I1216 12:46:38.935673 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eff3d741-729e-4ed9-a6a3-d314f99d7c29-kubelet-dir\") pod \"csi-node-driver-h22qr\" (UID: \"eff3d741-729e-4ed9-a6a3-d314f99d7c29\") " pod="calico-system/csi-node-driver-h22qr" Dec 16 12:46:38.941750 kubelet[2807]: E1216 12:46:38.936271 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.941750 kubelet[2807]: W1216 12:46:38.936281 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.941750 kubelet[2807]: E1216 12:46:38.936309 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.941750 kubelet[2807]: I1216 12:46:38.936333 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eff3d741-729e-4ed9-a6a3-d314f99d7c29-registration-dir\") pod \"csi-node-driver-h22qr\" (UID: \"eff3d741-729e-4ed9-a6a3-d314f99d7c29\") " pod="calico-system/csi-node-driver-h22qr" Dec 16 12:46:38.941750 kubelet[2807]: E1216 12:46:38.936554 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.941964 kubelet[2807]: W1216 12:46:38.936562 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.941964 kubelet[2807]: E1216 12:46:38.936570 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.941964 kubelet[2807]: I1216 12:46:38.936646 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/eff3d741-729e-4ed9-a6a3-d314f99d7c29-varrun\") pod \"csi-node-driver-h22qr\" (UID: \"eff3d741-729e-4ed9-a6a3-d314f99d7c29\") " pod="calico-system/csi-node-driver-h22qr" Dec 16 12:46:38.941964 kubelet[2807]: E1216 12:46:38.937893 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.941964 kubelet[2807]: W1216 12:46:38.937903 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.941964 kubelet[2807]: E1216 12:46:38.937911 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:38.941964 kubelet[2807]: E1216 12:46:38.938193 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.941964 kubelet[2807]: W1216 12:46:38.938202 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.941964 kubelet[2807]: E1216 12:46:38.938209 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.942158 kubelet[2807]: E1216 12:46:38.938410 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.942158 kubelet[2807]: W1216 12:46:38.938417 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.942158 kubelet[2807]: E1216 12:46:38.938424 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.942158 kubelet[2807]: E1216 12:46:38.938648 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.942158 kubelet[2807]: W1216 12:46:38.938656 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.942158 kubelet[2807]: E1216 12:46:38.938663 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.942158 kubelet[2807]: E1216 12:46:38.939140 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.942158 kubelet[2807]: W1216 12:46:38.939148 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.942158 kubelet[2807]: E1216 12:46:38.939156 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:38.942308 kubelet[2807]: I1216 12:46:38.939210 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eff3d741-729e-4ed9-a6a3-d314f99d7c29-socket-dir\") pod \"csi-node-driver-h22qr\" (UID: \"eff3d741-729e-4ed9-a6a3-d314f99d7c29\") " pod="calico-system/csi-node-driver-h22qr" Dec 16 12:46:38.942308 kubelet[2807]: E1216 12:46:38.939751 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.942308 kubelet[2807]: W1216 12:46:38.939763 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.942308 kubelet[2807]: E1216 12:46:38.940090 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.942308 kubelet[2807]: I1216 12:46:38.940106 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m799h\" (UniqueName: \"kubernetes.io/projected/eff3d741-729e-4ed9-a6a3-d314f99d7c29-kube-api-access-m799h\") pod \"csi-node-driver-h22qr\" (UID: \"eff3d741-729e-4ed9-a6a3-d314f99d7c29\") " pod="calico-system/csi-node-driver-h22qr" Dec 16 12:46:38.942308 kubelet[2807]: E1216 12:46:38.941440 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.942308 kubelet[2807]: W1216 12:46:38.941451 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.942308 kubelet[2807]: E1216 12:46:38.941463 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.942308 kubelet[2807]: E1216 12:46:38.942083 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.886000 audit: BPF prog-id=153 op=LOAD Dec 16 12:46:38.945089 kubelet[2807]: W1216 12:46:38.942092 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.945089 kubelet[2807]: E1216 12:46:38.942102 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:38.945089 kubelet[2807]: E1216 12:46:38.943087 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.945089 kubelet[2807]: W1216 12:46:38.943095 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.945089 kubelet[2807]: E1216 12:46:38.943104 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.945089 kubelet[2807]: E1216 12:46:38.943466 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.945089 kubelet[2807]: W1216 12:46:38.943477 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.945089 kubelet[2807]: E1216 12:46:38.943606 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.945089 kubelet[2807]: E1216 12:46:38.944363 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.945089 kubelet[2807]: W1216 12:46:38.944371 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.945242 kubelet[2807]: E1216 12:46:38.944379 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:38.945242 kubelet[2807]: E1216 12:46:38.944830 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:38.945242 kubelet[2807]: W1216 12:46:38.944838 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:38.945242 kubelet[2807]: E1216 12:46:38.944846 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:38.953020 kernel: audit: type=1334 audit(1765889198.886:534): prog-id=153 op=LOAD Dec 16 12:46:38.953077 kernel: audit: type=1300 audit(1765889198.886:534): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3248 pid=3259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.886000 audit[3259]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3248 pid=3259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.959459 kernel: audit: type=1327 audit(1765889198.886:534): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330373637303838346361303038383138653661303862303938666632 Dec 16 12:46:38.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330373637303838346361303038383138653661303862303938666632 Dec 16 12:46:38.886000 audit: BPF prog-id=154 op=LOAD Dec 16 12:46:38.886000 audit[3259]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3248 pid=3259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330373637303838346361303038383138653661303862303938666632 Dec 16 12:46:38.886000 audit: BPF prog-id=154 op=UNLOAD Dec 16 12:46:38.886000 audit[3259]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3248 pid=3259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330373637303838346361303038383138653661303862303938666632 Dec 16 12:46:38.886000 audit: BPF prog-id=153 op=UNLOAD Dec 16 12:46:38.886000 audit[3259]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3248 pid=3259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330373637303838346361303038383138653661303862303938666632 Dec 16 12:46:38.886000 audit: BPF prog-id=155 op=LOAD Dec 16 
12:46:38.886000 audit[3259]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3248 pid=3259 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:38.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330373637303838346361303038383138653661303862303938666632 Dec 16 12:46:38.965247 containerd[1624]: time="2025-12-16T12:46:38.964964071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5547cc86fd-bfvwn,Uid:1f4fe4c5-3364-4602-bf63-3dcb7b385529,Namespace:calico-system,Attempt:0,} returns sandbox id \"307670884ca008818e6a08b098ff2dc8c4d20bb5599e5cb73f38e48f698c4ca6\"" Dec 16 12:46:38.968996 containerd[1624]: time="2025-12-16T12:46:38.968968684Z" level=info msg="connecting to shim ba5a7d40de3781d228848ba6221a20f3569d650d730dcac7bd64d04646b40e33" address="unix:///run/containerd/s/f7abee2d438b28864fc602dda003b25158d9b4ac834efe312088c676cae39fb5" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:46:38.970813 containerd[1624]: time="2025-12-16T12:46:38.970665009Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 12:46:38.998725 systemd[1]: Started cri-containerd-ba5a7d40de3781d228848ba6221a20f3569d650d730dcac7bd64d04646b40e33.scope - libcontainer container ba5a7d40de3781d228848ba6221a20f3569d650d730dcac7bd64d04646b40e33. Dec 16 12:46:39.007000 audit: BPF prog-id=156 op=LOAD Dec 16 12:46:39.007000 audit: BPF prog-id=157 op=LOAD Dec 16 12:46:39.007000 audit[3351]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3339 pid=3351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:39.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261356137643430646533373831643232383834386261363232316132 Dec 16 12:46:39.007000 audit: BPF prog-id=157 op=UNLOAD Dec 16 12:46:39.007000 audit[3351]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:39.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261356137643430646533373831643232383834386261363232316132 Dec 16 12:46:39.007000 audit: BPF prog-id=158 op=LOAD Dec 16 12:46:39.007000 audit[3351]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3339 pid=3351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:39.007000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261356137643430646533373831643232383834386261363232316132 Dec 16 12:46:39.007000 audit: BPF prog-id=159 op=LOAD Dec 16 12:46:39.007000 audit[3351]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3339 pid=3351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:39.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261356137643430646533373831643232383834386261363232316132 Dec 16 12:46:39.008000 audit: BPF prog-id=159 op=UNLOAD Dec 16 12:46:39.008000 audit[3351]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:39.008000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261356137643430646533373831643232383834386261363232316132 Dec 16 12:46:39.008000 audit: BPF prog-id=158 op=UNLOAD Dec 16 12:46:39.008000 audit[3351]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:39.008000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261356137643430646533373831643232383834386261363232316132 Dec 16 12:46:39.008000 audit: BPF prog-id=160 op=LOAD Dec 16 12:46:39.008000 audit[3351]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3339 pid=3351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:39.008000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261356137643430646533373831643232383834386261363232316132 Dec 16 12:46:39.029186 containerd[1624]: time="2025-12-16T12:46:39.025760665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v4brf,Uid:5a663a37-9473-4671-a0c0-fcbe5cb82e9f,Namespace:calico-system,Attempt:0,} returns sandbox id \"ba5a7d40de3781d228848ba6221a20f3569d650d730dcac7bd64d04646b40e33\"" Dec 16 12:46:39.041075 kubelet[2807]: E1216 12:46:39.041016 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.043574 kubelet[2807]: W1216 12:46:39.041062 2807 
driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.043862 kubelet[2807]: E1216 12:46:39.043589 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.044270 kubelet[2807]: E1216 12:46:39.044143 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.044270 kubelet[2807]: W1216 12:46:39.044154 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.044632 kubelet[2807]: E1216 12:46:39.044167 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.045076 kubelet[2807]: E1216 12:46:39.045047 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.045263 kubelet[2807]: W1216 12:46:39.045236 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.045263 kubelet[2807]: E1216 12:46:39.045253 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.045722 kubelet[2807]: E1216 12:46:39.045605 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.045722 kubelet[2807]: W1216 12:46:39.045666 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.045722 kubelet[2807]: E1216 12:46:39.045678 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.046438 kubelet[2807]: E1216 12:46:39.046429 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.046602 kubelet[2807]: W1216 12:46:39.046504 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.046602 kubelet[2807]: E1216 12:46:39.046519 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:39.047209 kubelet[2807]: E1216 12:46:39.047199 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.047313 kubelet[2807]: W1216 12:46:39.047257 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.047550 kubelet[2807]: E1216 12:46:39.047466 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.048116 kubelet[2807]: E1216 12:46:39.047991 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.048116 kubelet[2807]: W1216 12:46:39.048094 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.048116 kubelet[2807]: E1216 12:46:39.048105 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.048728 kubelet[2807]: E1216 12:46:39.048715 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.048806 kubelet[2807]: W1216 12:46:39.048778 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.048806 kubelet[2807]: E1216 12:46:39.048789 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.049296 kubelet[2807]: E1216 12:46:39.049233 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.049435 kubelet[2807]: W1216 12:46:39.049339 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.049435 kubelet[2807]: E1216 12:46:39.049350 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.049969 kubelet[2807]: E1216 12:46:39.049943 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.049969 kubelet[2807]: W1216 12:46:39.049952 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.049969 kubelet[2807]: E1216 12:46:39.049959 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:39.050556 kubelet[2807]: E1216 12:46:39.050453 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.050659 kubelet[2807]: W1216 12:46:39.050637 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.050659 kubelet[2807]: E1216 12:46:39.050650 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.051579 kubelet[2807]: E1216 12:46:39.051569 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.051662 kubelet[2807]: W1216 12:46:39.051633 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.051662 kubelet[2807]: E1216 12:46:39.051645 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.051895 kubelet[2807]: E1216 12:46:39.051859 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.052048 kubelet[2807]: W1216 12:46:39.052037 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.052191 kubelet[2807]: E1216 12:46:39.052181 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.052642 kubelet[2807]: E1216 12:46:39.052621 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.052972 kubelet[2807]: W1216 12:46:39.052807 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.052972 kubelet[2807]: E1216 12:46:39.052822 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.053520 kubelet[2807]: E1216 12:46:39.053396 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.053520 kubelet[2807]: W1216 12:46:39.053414 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.053520 kubelet[2807]: E1216 12:46:39.053423 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:39.054090 kubelet[2807]: E1216 12:46:39.053981 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.054235 kubelet[2807]: W1216 12:46:39.054223 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.054381 kubelet[2807]: E1216 12:46:39.054370 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.054821 kubelet[2807]: E1216 12:46:39.054723 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.055081 kubelet[2807]: W1216 12:46:39.054983 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.055081 kubelet[2807]: E1216 12:46:39.054997 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.055648 kubelet[2807]: E1216 12:46:39.055578 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.055648 kubelet[2807]: W1216 12:46:39.055587 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.055648 kubelet[2807]: E1216 12:46:39.055595 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.056163 kubelet[2807]: E1216 12:46:39.056062 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.056163 kubelet[2807]: W1216 12:46:39.056088 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.056163 kubelet[2807]: E1216 12:46:39.056096 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.056719 kubelet[2807]: E1216 12:46:39.056609 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.056937 kubelet[2807]: W1216 12:46:39.056778 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.056937 kubelet[2807]: E1216 12:46:39.056794 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:39.057515 kubelet[2807]: E1216 12:46:39.057338 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.057515 kubelet[2807]: W1216 12:46:39.057357 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.057515 kubelet[2807]: E1216 12:46:39.057366 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.058344 kubelet[2807]: E1216 12:46:39.058334 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.058443 kubelet[2807]: W1216 12:46:39.058390 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.058443 kubelet[2807]: E1216 12:46:39.058402 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.059448 kubelet[2807]: E1216 12:46:39.059215 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.059448 kubelet[2807]: W1216 12:46:39.059227 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.059448 kubelet[2807]: E1216 12:46:39.059236 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.059770 kubelet[2807]: E1216 12:46:39.059751 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.059943 kubelet[2807]: W1216 12:46:39.059885 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.060091 kubelet[2807]: E1216 12:46:39.060076 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.060704 kubelet[2807]: E1216 12:46:39.060669 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.060704 kubelet[2807]: W1216 12:46:39.060678 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.060704 kubelet[2807]: E1216 12:46:39.060686 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:39.077561 kubelet[2807]: E1216 12:46:39.077474 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:39.077561 kubelet[2807]: W1216 12:46:39.077491 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:39.077561 kubelet[2807]: E1216 12:46:39.077506 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:39.429000 audit[3405]: NETFILTER_CFG table=filter:113 family=2 entries=22 op=nft_register_rule pid=3405 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:39.429000 audit[3405]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffd2e774fd0 a2=0 a3=7ffd2e774fbc items=0 ppid=2972 pid=3405 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:39.429000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:39.434000 audit[3405]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3405 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:46:39.434000 audit[3405]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd2e774fd0 a2=0 a3=0 items=0 ppid=2972 pid=3405 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:39.434000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:46:40.413524 kubelet[2807]: E1216 12:46:40.412378 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:46:41.012787 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2072199647.mount: Deactivated successfully. 
Dec 16 12:46:42.294689 containerd[1624]: time="2025-12-16T12:46:42.294630241Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:42.295182 containerd[1624]: time="2025-12-16T12:46:42.295146175Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Dec 16 12:46:42.295573 containerd[1624]: time="2025-12-16T12:46:42.295408756Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:42.297110 containerd[1624]: time="2025-12-16T12:46:42.297081234Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:42.297693 containerd[1624]: time="2025-12-16T12:46:42.297665928Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 3.326844091s" Dec 16 12:46:42.297787 containerd[1624]: time="2025-12-16T12:46:42.297770488Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Dec 16 12:46:42.299173 containerd[1624]: time="2025-12-16T12:46:42.299140761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 12:46:42.318781 containerd[1624]: time="2025-12-16T12:46:42.318736191Z" level=info msg="CreateContainer within sandbox \"307670884ca008818e6a08b098ff2dc8c4d20bb5599e5cb73f38e48f698c4ca6\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 12:46:42.327038 containerd[1624]: time="2025-12-16T12:46:42.326991621Z" level=info msg="Container 9bea5f11f37550fd10ffc208c29409d7d27646962a3de0e6c7dc7fce19fd85e8: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:46:42.344819 containerd[1624]: time="2025-12-16T12:46:42.344776549Z" level=info msg="CreateContainer within sandbox \"307670884ca008818e6a08b098ff2dc8c4d20bb5599e5cb73f38e48f698c4ca6\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9bea5f11f37550fd10ffc208c29409d7d27646962a3de0e6c7dc7fce19fd85e8\"" Dec 16 12:46:42.345637 containerd[1624]: time="2025-12-16T12:46:42.345505559Z" level=info msg="StartContainer for \"9bea5f11f37550fd10ffc208c29409d7d27646962a3de0e6c7dc7fce19fd85e8\"" Dec 16 12:46:42.347139 containerd[1624]: time="2025-12-16T12:46:42.347083427Z" level=info msg="connecting to shim 9bea5f11f37550fd10ffc208c29409d7d27646962a3de0e6c7dc7fce19fd85e8" address="unix:///run/containerd/s/85c32dca12e72a3d65e471ae396f57ac44861e5bd9e7a50e75ab515c06311744" protocol=ttrpc version=3 Dec 16 12:46:42.376696 systemd[1]: Started cri-containerd-9bea5f11f37550fd10ffc208c29409d7d27646962a3de0e6c7dc7fce19fd85e8.scope - libcontainer container 9bea5f11f37550fd10ffc208c29409d7d27646962a3de0e6c7dc7fce19fd85e8. 
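The audit records interleaved through this section (BPF prog-id=... op=LOAD/UNLOAD with SYSCALL arch=c000003e syscall=321, comm="runc") accompany every container start and are most likely runc installing and replacing its per-container cgroup eBPF programs; syscall 321 is bpf(2) on x86_64. The PROCTITLE field in those records is the invoking command line, hex-encoded with NUL-separated arguments and truncated by the kernel at 128 bytes. The snippet below decodes one of the proctitle values seen earlier; the hex constant is copied from the audit(1765889198.885:532) record above, and its trailing container ID is cut short by the kernel's proctitle limit, not by this excerpt.

# Decode an audit PROCTITLE value (hex-encoded argv, NUL-separated) into a readable command line.
PROCTITLE_HEX = (
    "72756E6300"                                                  # "runc\0"
    "2D2D726F6F7400"                                              # "--root\0"
    "2F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F00"    # "/run/containerd/runc/k8s.io\0"
    "2D2D6C6F6700"                                                # "--log\0"
    "2F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E"
    "72756E74696D652E76322E7461736B2F6B38732E696F2F"
    "3330373637303838346361303038383138653661303862303938666632"  # truncated log path
)
args = bytes.fromhex(PROCTITLE_HEX).split(b"\x00")
print(" ".join(arg.decode("utf-8", errors="replace") for arg in args))
# -> runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/307670884ca008818e6a08b098ff2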
Dec 16 12:46:42.392000 audit: BPF prog-id=161 op=LOAD Dec 16 12:46:42.392000 audit: BPF prog-id=162 op=LOAD Dec 16 12:46:42.392000 audit[3416]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3248 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:42.392000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962656135663131663337353530666431306666633230386332393430 Dec 16 12:46:42.393000 audit: BPF prog-id=162 op=UNLOAD Dec 16 12:46:42.393000 audit[3416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3248 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:42.393000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962656135663131663337353530666431306666633230386332393430 Dec 16 12:46:42.393000 audit: BPF prog-id=163 op=LOAD Dec 16 12:46:42.393000 audit[3416]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3248 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:42.393000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962656135663131663337353530666431306666633230386332393430 Dec 16 12:46:42.393000 audit: BPF prog-id=164 op=LOAD Dec 16 12:46:42.393000 audit[3416]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3248 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:42.393000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962656135663131663337353530666431306666633230386332393430 Dec 16 12:46:42.393000 audit: BPF prog-id=164 op=UNLOAD Dec 16 12:46:42.393000 audit[3416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3248 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:42.393000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962656135663131663337353530666431306666633230386332393430 Dec 16 12:46:42.393000 audit: BPF prog-id=163 op=UNLOAD Dec 16 12:46:42.393000 audit[3416]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3248 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:42.393000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962656135663131663337353530666431306666633230386332393430 Dec 16 12:46:42.393000 audit: BPF prog-id=165 op=LOAD Dec 16 12:46:42.393000 audit[3416]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3248 pid=3416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:42.393000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962656135663131663337353530666431306666633230386332393430 Dec 16 12:46:42.419929 kubelet[2807]: E1216 12:46:42.419394 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:46:42.440783 containerd[1624]: time="2025-12-16T12:46:42.440625213Z" level=info msg="StartContainer for \"9bea5f11f37550fd10ffc208c29409d7d27646962a3de0e6c7dc7fce19fd85e8\" returns successfully" Dec 16 12:46:42.548428 kubelet[2807]: E1216 12:46:42.548227 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.548428 kubelet[2807]: W1216 12:46:42.548253 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.549241 kubelet[2807]: E1216 12:46:42.548907 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.549618 kubelet[2807]: E1216 12:46:42.549601 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.549618 kubelet[2807]: W1216 12:46:42.549610 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.549618 kubelet[2807]: E1216 12:46:42.549623 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:42.549808 kubelet[2807]: E1216 12:46:42.549731 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.549808 kubelet[2807]: W1216 12:46:42.549738 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.549808 kubelet[2807]: E1216 12:46:42.549744 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.552603 kubelet[2807]: E1216 12:46:42.552587 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.552603 kubelet[2807]: W1216 12:46:42.552601 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.552603 kubelet[2807]: E1216 12:46:42.552611 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.552747 kubelet[2807]: E1216 12:46:42.552727 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.552747 kubelet[2807]: W1216 12:46:42.552738 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.552747 kubelet[2807]: E1216 12:46:42.552746 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.553647 kubelet[2807]: E1216 12:46:42.553632 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.553647 kubelet[2807]: W1216 12:46:42.553644 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.553722 kubelet[2807]: E1216 12:46:42.553652 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.553771 kubelet[2807]: E1216 12:46:42.553754 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.553771 kubelet[2807]: W1216 12:46:42.553760 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.553771 kubelet[2807]: E1216 12:46:42.553766 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:42.554436 kubelet[2807]: E1216 12:46:42.553867 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.554436 kubelet[2807]: W1216 12:46:42.553875 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.554436 kubelet[2807]: E1216 12:46:42.553881 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.554436 kubelet[2807]: E1216 12:46:42.553984 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.554436 kubelet[2807]: W1216 12:46:42.553990 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.554436 kubelet[2807]: E1216 12:46:42.553996 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.554436 kubelet[2807]: E1216 12:46:42.554098 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.554436 kubelet[2807]: W1216 12:46:42.554105 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.554436 kubelet[2807]: E1216 12:46:42.554110 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.554436 kubelet[2807]: E1216 12:46:42.554198 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.554635 kubelet[2807]: W1216 12:46:42.554204 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.554635 kubelet[2807]: E1216 12:46:42.554210 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.554635 kubelet[2807]: E1216 12:46:42.554304 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.554635 kubelet[2807]: W1216 12:46:42.554310 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.554635 kubelet[2807]: E1216 12:46:42.554316 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:42.554635 kubelet[2807]: E1216 12:46:42.554414 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.554635 kubelet[2807]: W1216 12:46:42.554421 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.554635 kubelet[2807]: E1216 12:46:42.554427 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.554635 kubelet[2807]: E1216 12:46:42.554518 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.554635 kubelet[2807]: W1216 12:46:42.554523 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.554885 kubelet[2807]: E1216 12:46:42.554549 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.554885 kubelet[2807]: E1216 12:46:42.554647 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.554885 kubelet[2807]: W1216 12:46:42.554653 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.554885 kubelet[2807]: E1216 12:46:42.554658 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.579452 kubelet[2807]: E1216 12:46:42.579419 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.579452 kubelet[2807]: W1216 12:46:42.579444 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.579452 kubelet[2807]: E1216 12:46:42.579462 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.579691 kubelet[2807]: E1216 12:46:42.579668 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.579691 kubelet[2807]: W1216 12:46:42.579683 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.579791 kubelet[2807]: E1216 12:46:42.579695 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:42.579902 kubelet[2807]: E1216 12:46:42.579885 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.579902 kubelet[2807]: W1216 12:46:42.579898 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.579949 kubelet[2807]: E1216 12:46:42.579904 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.580592 kubelet[2807]: E1216 12:46:42.580574 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.580645 kubelet[2807]: W1216 12:46:42.580588 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.580645 kubelet[2807]: E1216 12:46:42.580607 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.581068 kubelet[2807]: E1216 12:46:42.580820 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.581068 kubelet[2807]: W1216 12:46:42.580830 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.581068 kubelet[2807]: E1216 12:46:42.580840 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.581328 kubelet[2807]: E1216 12:46:42.581309 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.581432 kubelet[2807]: W1216 12:46:42.581324 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.581463 kubelet[2807]: E1216 12:46:42.581434 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.582729 kubelet[2807]: E1216 12:46:42.582699 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.582729 kubelet[2807]: W1216 12:46:42.582716 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.582729 kubelet[2807]: E1216 12:46:42.582727 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:42.582890 kubelet[2807]: E1216 12:46:42.582872 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.582890 kubelet[2807]: W1216 12:46:42.582884 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.582890 kubelet[2807]: E1216 12:46:42.582891 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.583071 kubelet[2807]: E1216 12:46:42.583056 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.583071 kubelet[2807]: W1216 12:46:42.583068 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.583128 kubelet[2807]: E1216 12:46:42.583075 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.583225 kubelet[2807]: E1216 12:46:42.583209 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.583225 kubelet[2807]: W1216 12:46:42.583220 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.583274 kubelet[2807]: E1216 12:46:42.583227 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.590657 kubelet[2807]: E1216 12:46:42.590591 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.590657 kubelet[2807]: W1216 12:46:42.590621 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.590657 kubelet[2807]: E1216 12:46:42.590640 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.591246 kubelet[2807]: E1216 12:46:42.591201 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.591465 kubelet[2807]: W1216 12:46:42.591396 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.591465 kubelet[2807]: E1216 12:46:42.591411 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:42.592016 kubelet[2807]: E1216 12:46:42.591936 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.592623 kubelet[2807]: W1216 12:46:42.592068 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.592623 kubelet[2807]: E1216 12:46:42.592082 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.592976 kubelet[2807]: E1216 12:46:42.592939 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.592976 kubelet[2807]: W1216 12:46:42.592952 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.592976 kubelet[2807]: E1216 12:46:42.592963 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.593731 kubelet[2807]: E1216 12:46:42.593598 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.593731 kubelet[2807]: W1216 12:46:42.593611 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.593731 kubelet[2807]: E1216 12:46:42.593621 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.594364 kubelet[2807]: E1216 12:46:42.594354 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.594433 kubelet[2807]: W1216 12:46:42.594410 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.594433 kubelet[2807]: E1216 12:46:42.594424 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:42.596436 kubelet[2807]: E1216 12:46:42.596388 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.596436 kubelet[2807]: W1216 12:46:42.596400 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.596436 kubelet[2807]: E1216 12:46:42.596410 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:42.597063 kubelet[2807]: E1216 12:46:42.597022 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:42.597063 kubelet[2807]: W1216 12:46:42.597032 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:42.597063 kubelet[2807]: E1216 12:46:42.597041 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.525834 kubelet[2807]: I1216 12:46:43.525787 2807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:46:43.561685 kubelet[2807]: E1216 12:46:43.561609 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.561685 kubelet[2807]: W1216 12:46:43.561632 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.561685 kubelet[2807]: E1216 12:46:43.561652 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.561945 kubelet[2807]: E1216 12:46:43.561807 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.561945 kubelet[2807]: W1216 12:46:43.561816 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.561945 kubelet[2807]: E1216 12:46:43.561825 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.562082 kubelet[2807]: E1216 12:46:43.561952 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.562082 kubelet[2807]: W1216 12:46:43.561960 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.562082 kubelet[2807]: E1216 12:46:43.561968 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.562195 kubelet[2807]: E1216 12:46:43.562103 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.562195 kubelet[2807]: W1216 12:46:43.562111 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.562195 kubelet[2807]: E1216 12:46:43.562119 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:43.562354 kubelet[2807]: E1216 12:46:43.562246 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.562354 kubelet[2807]: W1216 12:46:43.562253 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.562354 kubelet[2807]: E1216 12:46:43.562260 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.562465 kubelet[2807]: E1216 12:46:43.562368 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.562465 kubelet[2807]: W1216 12:46:43.562375 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.562465 kubelet[2807]: E1216 12:46:43.562382 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.562592 kubelet[2807]: E1216 12:46:43.562497 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.562592 kubelet[2807]: W1216 12:46:43.562504 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.562592 kubelet[2807]: E1216 12:46:43.562511 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.562695 kubelet[2807]: E1216 12:46:43.562665 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.562695 kubelet[2807]: W1216 12:46:43.562679 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.562695 kubelet[2807]: E1216 12:46:43.562689 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.562880 kubelet[2807]: E1216 12:46:43.562856 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.562880 kubelet[2807]: W1216 12:46:43.562872 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.562961 kubelet[2807]: E1216 12:46:43.562884 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:43.563107 kubelet[2807]: E1216 12:46:43.563092 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.563107 kubelet[2807]: W1216 12:46:43.563105 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.563171 kubelet[2807]: E1216 12:46:43.563115 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.563246 kubelet[2807]: E1216 12:46:43.563235 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.563246 kubelet[2807]: W1216 12:46:43.563245 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.563308 kubelet[2807]: E1216 12:46:43.563253 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.563374 kubelet[2807]: E1216 12:46:43.563363 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.563374 kubelet[2807]: W1216 12:46:43.563373 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.563427 kubelet[2807]: E1216 12:46:43.563381 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.563556 kubelet[2807]: E1216 12:46:43.563516 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.563556 kubelet[2807]: W1216 12:46:43.563550 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.563633 kubelet[2807]: E1216 12:46:43.563561 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.564004 kubelet[2807]: E1216 12:46:43.563684 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.564004 kubelet[2807]: W1216 12:46:43.563691 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.564004 kubelet[2807]: E1216 12:46:43.563699 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:43.564004 kubelet[2807]: E1216 12:46:43.563842 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.564004 kubelet[2807]: W1216 12:46:43.563849 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.564004 kubelet[2807]: E1216 12:46:43.563857 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.587251 kubelet[2807]: E1216 12:46:43.587218 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.587251 kubelet[2807]: W1216 12:46:43.587238 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.587251 kubelet[2807]: E1216 12:46:43.587256 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.587462 kubelet[2807]: E1216 12:46:43.587433 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.587497 kubelet[2807]: W1216 12:46:43.587453 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.587497 kubelet[2807]: E1216 12:46:43.587491 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.587812 kubelet[2807]: E1216 12:46:43.587713 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.587812 kubelet[2807]: W1216 12:46:43.587731 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.587812 kubelet[2807]: E1216 12:46:43.587747 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.588002 kubelet[2807]: E1216 12:46:43.587968 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.588002 kubelet[2807]: W1216 12:46:43.587982 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.588002 kubelet[2807]: E1216 12:46:43.587993 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:43.588326 kubelet[2807]: E1216 12:46:43.588201 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.588326 kubelet[2807]: W1216 12:46:43.588210 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.588326 kubelet[2807]: E1216 12:46:43.588219 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.588547 kubelet[2807]: E1216 12:46:43.588377 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.588547 kubelet[2807]: W1216 12:46:43.588386 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.588547 kubelet[2807]: E1216 12:46:43.588394 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.588785 kubelet[2807]: E1216 12:46:43.588561 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.588785 kubelet[2807]: W1216 12:46:43.588569 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.588785 kubelet[2807]: E1216 12:46:43.588577 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.588992 kubelet[2807]: E1216 12:46:43.588962 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.588992 kubelet[2807]: W1216 12:46:43.588978 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.588992 kubelet[2807]: E1216 12:46:43.588988 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.589147 kubelet[2807]: E1216 12:46:43.589133 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.589147 kubelet[2807]: W1216 12:46:43.589140 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.589147 kubelet[2807]: E1216 12:46:43.589148 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:43.589369 kubelet[2807]: E1216 12:46:43.589337 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.589369 kubelet[2807]: W1216 12:46:43.589351 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.589369 kubelet[2807]: E1216 12:46:43.589360 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.589563 kubelet[2807]: E1216 12:46:43.589521 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.589563 kubelet[2807]: W1216 12:46:43.589549 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.589563 kubelet[2807]: E1216 12:46:43.589557 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.589741 kubelet[2807]: E1216 12:46:43.589718 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.589741 kubelet[2807]: W1216 12:46:43.589729 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.589741 kubelet[2807]: E1216 12:46:43.589737 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.589936 kubelet[2807]: E1216 12:46:43.589916 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.589936 kubelet[2807]: W1216 12:46:43.589927 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.589936 kubelet[2807]: E1216 12:46:43.589935 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.591698 kubelet[2807]: E1216 12:46:43.591665 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.591698 kubelet[2807]: W1216 12:46:43.591681 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.591698 kubelet[2807]: E1216 12:46:43.591691 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:43.591894 kubelet[2807]: E1216 12:46:43.591862 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.591894 kubelet[2807]: W1216 12:46:43.591876 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.591894 kubelet[2807]: E1216 12:46:43.591886 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.592023 kubelet[2807]: E1216 12:46:43.592017 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.592023 kubelet[2807]: W1216 12:46:43.592024 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.592135 kubelet[2807]: E1216 12:46:43.592036 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.592925 kubelet[2807]: E1216 12:46:43.592894 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.592925 kubelet[2807]: W1216 12:46:43.592910 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.592925 kubelet[2807]: E1216 12:46:43.592921 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:46:43.594428 kubelet[2807]: E1216 12:46:43.594282 2807 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:46:43.594428 kubelet[2807]: W1216 12:46:43.594294 2807 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:46:43.594428 kubelet[2807]: E1216 12:46:43.594304 2807 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:46:44.195839 containerd[1624]: time="2025-12-16T12:46:44.195785883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:44.196967 containerd[1624]: time="2025-12-16T12:46:44.196837725Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Dec 16 12:46:44.197735 containerd[1624]: time="2025-12-16T12:46:44.197703003Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:44.202163 containerd[1624]: time="2025-12-16T12:46:44.202123750Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:44.202564 containerd[1624]: time="2025-12-16T12:46:44.202522989Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.903357482s" Dec 16 12:46:44.202688 containerd[1624]: time="2025-12-16T12:46:44.202629632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Dec 16 12:46:44.242548 containerd[1624]: time="2025-12-16T12:46:44.242477728Z" level=info msg="CreateContainer within sandbox \"ba5a7d40de3781d228848ba6221a20f3569d650d730dcac7bd64d04646b40e33\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 12:46:44.249773 containerd[1624]: time="2025-12-16T12:46:44.249721148Z" level=info msg="Container 2ee7542eb4242b291d8453a6c1e26039d935ecb1a2a776804cf40f881372ec18: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:46:44.258589 containerd[1624]: time="2025-12-16T12:46:44.258553443Z" level=info msg="CreateContainer within sandbox \"ba5a7d40de3781d228848ba6221a20f3569d650d730dcac7bd64d04646b40e33\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"2ee7542eb4242b291d8453a6c1e26039d935ecb1a2a776804cf40f881372ec18\"" Dec 16 12:46:44.259573 containerd[1624]: time="2025-12-16T12:46:44.259184645Z" level=info msg="StartContainer for \"2ee7542eb4242b291d8453a6c1e26039d935ecb1a2a776804cf40f881372ec18\"" Dec 16 12:46:44.269032 containerd[1624]: time="2025-12-16T12:46:44.268982789Z" level=info msg="connecting to shim 2ee7542eb4242b291d8453a6c1e26039d935ecb1a2a776804cf40f881372ec18" address="unix:///run/containerd/s/f7abee2d438b28864fc602dda003b25158d9b4ac834efe312088c676cae39fb5" protocol=ttrpc version=3 Dec 16 12:46:44.291704 systemd[1]: Started cri-containerd-2ee7542eb4242b291d8453a6c1e26039d935ecb1a2a776804cf40f881372ec18.scope - libcontainer container 2ee7542eb4242b291d8453a6c1e26039d935ecb1a2a776804cf40f881372ec18. 
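The long runs of "Failed to unmarshal output for command: init" above come from kubelet's FlexVolume probe: it executes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the single argument init and parses its stdout as JSON, and while that binary is missing the output is empty and the decode fails; the flexvol-driver init container started above is what normally installs it. A minimal sketch of the JSON handshake such a driver answers with, assuming only the standard FlexVolume call convention (the stub is illustrative, not Calico's actual driver):

    package main

    import (
        "encoding/json"
        "fmt"
        "os"
    )

    // driverStatus mirrors the JSON object kubelet's driver-call code unmarshals.
    type driverStatus struct {
        Status       string          `json:"status"`
        Message      string          `json:"message,omitempty"`
        Capabilities map[string]bool `json:"capabilities,omitempty"`
    }

    func main() {
        if len(os.Args) > 1 && os.Args[1] == "init" {
            // This reply is what the "unexpected end of JSON input" errors are missing.
            reply, _ := json.Marshal(driverStatus{
                Status:       "Success",
                Capabilities: map[string]bool{"attach": false},
            })
            fmt.Println(string(reply))
            return
        }
        // Other calls (mount, unmount, ...) are out of scope for this sketch.
        reply, _ := json.Marshal(driverStatus{Status: "Not supported"})
        fmt.Println(string(reply))
    }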
Dec 16 12:46:44.337000 audit: BPF prog-id=166 op=LOAD Dec 16 12:46:44.340263 kernel: kauditd_printk_skb: 62 callbacks suppressed Dec 16 12:46:44.340347 kernel: audit: type=1334 audit(1765889204.337:557): prog-id=166 op=LOAD Dec 16 12:46:44.337000 audit[3524]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3339 pid=3524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:44.343949 kernel: audit: type=1300 audit(1765889204.337:557): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3339 pid=3524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:44.337000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265653735343265623432343262323931643834353361366331653236 Dec 16 12:46:44.351562 kernel: audit: type=1327 audit(1765889204.337:557): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265653735343265623432343262323931643834353361366331653236 Dec 16 12:46:44.356935 kernel: audit: type=1334 audit(1765889204.338:558): prog-id=167 op=LOAD Dec 16 12:46:44.338000 audit: BPF prog-id=167 op=LOAD Dec 16 12:46:44.363613 kernel: audit: type=1300 audit(1765889204.338:558): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3339 pid=3524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:44.338000 audit[3524]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3339 pid=3524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:44.338000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265653735343265623432343262323931643834353361366331653236 Dec 16 12:46:44.338000 audit: BPF prog-id=167 op=UNLOAD Dec 16 12:46:44.374355 kernel: audit: type=1327 audit(1765889204.338:558): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265653735343265623432343262323931643834353361366331653236 Dec 16 12:46:44.374399 kernel: audit: type=1334 audit(1765889204.338:559): prog-id=167 op=UNLOAD Dec 16 12:46:44.374418 kernel: audit: type=1300 audit(1765889204.338:559): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:44.338000 
audit[3524]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:44.382289 kernel: audit: type=1327 audit(1765889204.338:559): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265653735343265623432343262323931643834353361366331653236 Dec 16 12:46:44.338000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265653735343265623432343262323931643834353361366331653236 Dec 16 12:46:44.386421 containerd[1624]: time="2025-12-16T12:46:44.386380278Z" level=info msg="StartContainer for \"2ee7542eb4242b291d8453a6c1e26039d935ecb1a2a776804cf40f881372ec18\" returns successfully" Dec 16 12:46:44.387606 kernel: audit: type=1334 audit(1765889204.338:560): prog-id=166 op=UNLOAD Dec 16 12:46:44.338000 audit: BPF prog-id=166 op=UNLOAD Dec 16 12:46:44.338000 audit[3524]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:44.338000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265653735343265623432343262323931643834353361366331653236 Dec 16 12:46:44.338000 audit: BPF prog-id=168 op=LOAD Dec 16 12:46:44.338000 audit[3524]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3339 pid=3524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:44.338000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265653735343265623432343262323931643834353361366331653236 Dec 16 12:46:44.398759 systemd[1]: cri-containerd-2ee7542eb4242b291d8453a6c1e26039d935ecb1a2a776804cf40f881372ec18.scope: Deactivated successfully. 
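The audit records above capture runc's bpf(2) calls while it starts the container (on x86_64, syscall 321 is bpf and syscall 3 is close, which pairs with the UNLOAD events as the program file descriptors are closed); the loaded programs are most likely the cgroup device filters. The PROCTITLE field is hex-encoded argv with NUL separators, and a prefix of the value from the records above decodes as follows:

package main

import (
	"encoding/hex"
	"fmt"
	"log"
	"strings"
)

func main() {
	// First part of the PROCTITLE value from the audit records above.
	const proctitle = "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F"

	raw, err := hex.DecodeString(proctitle)
	if err != nil {
		log.Fatal(err)
	}
	// argv elements are separated by NUL bytes.
	fmt.Println(strings.ReplaceAll(string(raw), "\x00", " "))
	// Output: runc --root /run/containerd/runc/k8s.io
}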
Dec 16 12:46:44.401000 audit: BPF prog-id=168 op=UNLOAD Dec 16 12:46:44.412951 kubelet[2807]: E1216 12:46:44.412625 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:46:44.416377 containerd[1624]: time="2025-12-16T12:46:44.416347480Z" level=info msg="received container exit event container_id:\"2ee7542eb4242b291d8453a6c1e26039d935ecb1a2a776804cf40f881372ec18\" id:\"2ee7542eb4242b291d8453a6c1e26039d935ecb1a2a776804cf40f881372ec18\" pid:3537 exited_at:{seconds:1765889204 nanos:405433552}" Dec 16 12:46:44.440988 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2ee7542eb4242b291d8453a6c1e26039d935ecb1a2a776804cf40f881372ec18-rootfs.mount: Deactivated successfully. Dec 16 12:46:44.530837 containerd[1624]: time="2025-12-16T12:46:44.530149293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 12:46:44.559038 kubelet[2807]: I1216 12:46:44.557465 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5547cc86fd-bfvwn" podStartSLOduration=3.22804587 podStartE2EDuration="6.557428739s" podCreationTimestamp="2025-12-16 12:46:38 +0000 UTC" firstStartedPulling="2025-12-16 12:46:38.969392144 +0000 UTC m=+22.719651137" lastFinishedPulling="2025-12-16 12:46:42.298775014 +0000 UTC m=+26.049034006" observedRunningTime="2025-12-16 12:46:42.546582762 +0000 UTC m=+26.296841755" watchObservedRunningTime="2025-12-16 12:46:44.557428739 +0000 UTC m=+28.307687722" Dec 16 12:46:46.413577 kubelet[2807]: E1216 12:46:46.412480 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:46:48.415122 kubelet[2807]: E1216 12:46:48.415026 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:46:49.982288 containerd[1624]: time="2025-12-16T12:46:49.982213941Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:49.983456 containerd[1624]: time="2025-12-16T12:46:49.983422294Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Dec 16 12:46:49.984824 containerd[1624]: time="2025-12-16T12:46:49.984784950Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:49.986742 containerd[1624]: time="2025-12-16T12:46:49.986705637Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:49.987324 containerd[1624]: time="2025-12-16T12:46:49.987280710Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 5.457099084s" Dec 16 12:46:49.987324 containerd[1624]: time="2025-12-16T12:46:49.987312749Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Dec 16 12:46:50.014657 containerd[1624]: time="2025-12-16T12:46:50.014613925Z" level=info msg="CreateContainer within sandbox \"ba5a7d40de3781d228848ba6221a20f3569d650d730dcac7bd64d04646b40e33\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 12:46:50.025975 containerd[1624]: time="2025-12-16T12:46:50.024751761Z" level=info msg="Container 332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:46:50.029304 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2072000537.mount: Deactivated successfully. Dec 16 12:46:50.039066 containerd[1624]: time="2025-12-16T12:46:50.039034853Z" level=info msg="CreateContainer within sandbox \"ba5a7d40de3781d228848ba6221a20f3569d650d730dcac7bd64d04646b40e33\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28\"" Dec 16 12:46:50.039520 containerd[1624]: time="2025-12-16T12:46:50.039385589Z" level=info msg="StartContainer for \"332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28\"" Dec 16 12:46:50.047405 containerd[1624]: time="2025-12-16T12:46:50.047033901Z" level=info msg="connecting to shim 332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28" address="unix:///run/containerd/s/f7abee2d438b28864fc602dda003b25158d9b4ac834efe312088c676cae39fb5" protocol=ttrpc version=3 Dec 16 12:46:50.067716 systemd[1]: Started cri-containerd-332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28.scope - libcontainer container 332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28. 
Dec 16 12:46:50.125000 audit: BPF prog-id=169 op=LOAD Dec 16 12:46:50.126707 kernel: kauditd_printk_skb: 6 callbacks suppressed Dec 16 12:46:50.126761 kernel: audit: type=1334 audit(1765889210.125:563): prog-id=169 op=LOAD Dec 16 12:46:50.125000 audit[3579]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3339 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:50.132874 kernel: audit: type=1300 audit(1765889210.125:563): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3339 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:50.125000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333326662636339386439383133366266393863646362383838313435 Dec 16 12:46:50.142014 kernel: audit: type=1327 audit(1765889210.125:563): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333326662636339386439383133366266393863646362383838313435 Dec 16 12:46:50.127000 audit: BPF prog-id=170 op=LOAD Dec 16 12:46:50.148701 kernel: audit: type=1334 audit(1765889210.127:564): prog-id=170 op=LOAD Dec 16 12:46:50.127000 audit[3579]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3339 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:50.153548 kernel: audit: type=1300 audit(1765889210.127:564): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3339 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:50.127000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333326662636339386439383133366266393863646362383838313435 Dec 16 12:46:50.162043 kernel: audit: type=1327 audit(1765889210.127:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333326662636339386439383133366266393863646362383838313435 Dec 16 12:46:50.127000 audit: BPF prog-id=170 op=UNLOAD Dec 16 12:46:50.168573 kernel: audit: type=1334 audit(1765889210.127:565): prog-id=170 op=UNLOAD Dec 16 12:46:50.174091 kernel: audit: type=1300 audit(1765889210.127:565): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:50.127000 audit[3579]: 
SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:50.127000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333326662636339386439383133366266393863646362383838313435 Dec 16 12:46:50.181257 kernel: audit: type=1327 audit(1765889210.127:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333326662636339386439383133366266393863646362383838313435 Dec 16 12:46:50.127000 audit: BPF prog-id=169 op=UNLOAD Dec 16 12:46:50.186621 kernel: audit: type=1334 audit(1765889210.127:566): prog-id=169 op=UNLOAD Dec 16 12:46:50.127000 audit[3579]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:50.127000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333326662636339386439383133366266393863646362383838313435 Dec 16 12:46:50.127000 audit: BPF prog-id=171 op=LOAD Dec 16 12:46:50.127000 audit[3579]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3339 pid=3579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:46:50.127000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333326662636339386439383133366266393863646362383838313435 Dec 16 12:46:50.196267 containerd[1624]: time="2025-12-16T12:46:50.196074271Z" level=info msg="StartContainer for \"332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28\" returns successfully" Dec 16 12:46:50.412999 kubelet[2807]: E1216 12:46:50.412708 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:46:50.557296 systemd[1]: cri-containerd-332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28.scope: Deactivated successfully. Dec 16 12:46:50.558634 systemd[1]: cri-containerd-332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28.scope: Consumed 391ms CPU time, 163M memory peak, 8.2M read from disk, 171.3M written to disk. 
Dec 16 12:46:50.560000 audit: BPF prog-id=171 op=UNLOAD Dec 16 12:46:50.561880 containerd[1624]: time="2025-12-16T12:46:50.561856554Z" level=info msg="received container exit event container_id:\"332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28\" id:\"332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28\" pid:3592 exited_at:{seconds:1765889210 nanos:559468662}" Dec 16 12:46:50.641276 kubelet[2807]: I1216 12:46:50.641241 2807 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Dec 16 12:46:50.701428 systemd[1]: Created slice kubepods-besteffort-podae41bde4_2571_4fb7_adb8_a8808483af15.slice - libcontainer container kubepods-besteffort-podae41bde4_2571_4fb7_adb8_a8808483af15.slice. Dec 16 12:46:50.726918 systemd[1]: Created slice kubepods-burstable-pod2ce9c252_c98d_4be2_ac79_61841133c167.slice - libcontainer container kubepods-burstable-pod2ce9c252_c98d_4be2_ac79_61841133c167.slice. Dec 16 12:46:50.737429 kubelet[2807]: I1216 12:46:50.737092 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1081c7e3-0645-4858-ac62-90f635bb19a9-calico-apiserver-certs\") pod \"calico-apiserver-9f54fc4f9-7wdcc\" (UID: \"1081c7e3-0645-4858-ac62-90f635bb19a9\") " pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" Dec 16 12:46:50.737429 kubelet[2807]: I1216 12:46:50.737118 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgn68\" (UniqueName: \"kubernetes.io/projected/2ce9c252-c98d-4be2-ac79-61841133c167-kube-api-access-kgn68\") pod \"coredns-66bc5c9577-p72td\" (UID: \"2ce9c252-c98d-4be2-ac79-61841133c167\") " pod="kube-system/coredns-66bc5c9577-p72td" Dec 16 12:46:50.737429 kubelet[2807]: I1216 12:46:50.737135 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7b3fffe8-2720-473c-b01e-f0da86506b4f-whisker-backend-key-pair\") pod \"whisker-65685c564b-hpvbh\" (UID: \"7b3fffe8-2720-473c-b01e-f0da86506b4f\") " pod="calico-system/whisker-65685c564b-hpvbh" Dec 16 12:46:50.737429 kubelet[2807]: I1216 12:46:50.737148 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ce9c252-c98d-4be2-ac79-61841133c167-config-volume\") pod \"coredns-66bc5c9577-p72td\" (UID: \"2ce9c252-c98d-4be2-ac79-61841133c167\") " pod="kube-system/coredns-66bc5c9577-p72td" Dec 16 12:46:50.737429 kubelet[2807]: I1216 12:46:50.737161 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae41bde4-2571-4fb7-adb8-a8808483af15-tigera-ca-bundle\") pod \"calico-kube-controllers-6df777597b-rfdvx\" (UID: \"ae41bde4-2571-4fb7-adb8-a8808483af15\") " pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" Dec 16 12:46:50.738490 kubelet[2807]: I1216 12:46:50.737175 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtmhd\" (UniqueName: \"kubernetes.io/projected/7b3fffe8-2720-473c-b01e-f0da86506b4f-kube-api-access-qtmhd\") pod \"whisker-65685c564b-hpvbh\" (UID: \"7b3fffe8-2720-473c-b01e-f0da86506b4f\") " pod="calico-system/whisker-65685c564b-hpvbh" Dec 16 12:46:50.738490 kubelet[2807]: I1216 12:46:50.737189 2807 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b3fffe8-2720-473c-b01e-f0da86506b4f-whisker-ca-bundle\") pod \"whisker-65685c564b-hpvbh\" (UID: \"7b3fffe8-2720-473c-b01e-f0da86506b4f\") " pod="calico-system/whisker-65685c564b-hpvbh" Dec 16 12:46:50.738490 kubelet[2807]: I1216 12:46:50.737204 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2f29\" (UniqueName: \"kubernetes.io/projected/1081c7e3-0645-4858-ac62-90f635bb19a9-kube-api-access-x2f29\") pod \"calico-apiserver-9f54fc4f9-7wdcc\" (UID: \"1081c7e3-0645-4858-ac62-90f635bb19a9\") " pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" Dec 16 12:46:50.738490 kubelet[2807]: I1216 12:46:50.737217 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhbk7\" (UniqueName: \"kubernetes.io/projected/ae41bde4-2571-4fb7-adb8-a8808483af15-kube-api-access-jhbk7\") pod \"calico-kube-controllers-6df777597b-rfdvx\" (UID: \"ae41bde4-2571-4fb7-adb8-a8808483af15\") " pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" Dec 16 12:46:50.740217 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-332fbcc98d98136bf98cdcb8881459a06125bf53993afeaf478a092fe30c1b28-rootfs.mount: Deactivated successfully. Dec 16 12:46:50.749747 systemd[1]: Created slice kubepods-besteffort-pod7b3fffe8_2720_473c_b01e_f0da86506b4f.slice - libcontainer container kubepods-besteffort-pod7b3fffe8_2720_473c_b01e_f0da86506b4f.slice. Dec 16 12:46:50.755691 systemd[1]: Created slice kubepods-besteffort-pod1081c7e3_0645_4858_ac62_90f635bb19a9.slice - libcontainer container kubepods-besteffort-pod1081c7e3_0645_4858_ac62_90f635bb19a9.slice. Dec 16 12:46:50.764896 systemd[1]: Created slice kubepods-burstable-pod8265dde8_f635_41ae_8628_19a7296dfdf0.slice - libcontainer container kubepods-burstable-pod8265dde8_f635_41ae_8628_19a7296dfdf0.slice. Dec 16 12:46:50.770586 systemd[1]: Created slice kubepods-besteffort-pod50e78f30_18dc_4823_b78b_3500e19d4f6f.slice - libcontainer container kubepods-besteffort-pod50e78f30_18dc_4823_b78b_3500e19d4f6f.slice. Dec 16 12:46:50.777786 systemd[1]: Created slice kubepods-besteffort-poddb9e60f0_9cc3_4000_b32c_9ba313d4b676.slice - libcontainer container kubepods-besteffort-poddb9e60f0_9cc3_4000_b32c_9ba313d4b676.slice. 
Dec 16 12:46:50.837823 kubelet[2807]: I1216 12:46:50.837794 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwp5v\" (UniqueName: \"kubernetes.io/projected/8265dde8-f635-41ae-8628-19a7296dfdf0-kube-api-access-wwp5v\") pod \"coredns-66bc5c9577-bqsxl\" (UID: \"8265dde8-f635-41ae-8628-19a7296dfdf0\") " pod="kube-system/coredns-66bc5c9577-bqsxl" Dec 16 12:46:50.838683 kubelet[2807]: I1216 12:46:50.838060 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db9e60f0-9cc3-4000-b32c-9ba313d4b676-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-sdw6q\" (UID: \"db9e60f0-9cc3-4000-b32c-9ba313d4b676\") " pod="calico-system/goldmane-7c778bb748-sdw6q" Dec 16 12:46:50.838683 kubelet[2807]: I1216 12:46:50.838079 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8265dde8-f635-41ae-8628-19a7296dfdf0-config-volume\") pod \"coredns-66bc5c9577-bqsxl\" (UID: \"8265dde8-f635-41ae-8628-19a7296dfdf0\") " pod="kube-system/coredns-66bc5c9577-bqsxl" Dec 16 12:46:50.839047 kubelet[2807]: I1216 12:46:50.839004 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgmf2\" (UniqueName: \"kubernetes.io/projected/db9e60f0-9cc3-4000-b32c-9ba313d4b676-kube-api-access-qgmf2\") pod \"goldmane-7c778bb748-sdw6q\" (UID: \"db9e60f0-9cc3-4000-b32c-9ba313d4b676\") " pod="calico-system/goldmane-7c778bb748-sdw6q" Dec 16 12:46:50.840127 kubelet[2807]: I1216 12:46:50.839642 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mb5g\" (UniqueName: \"kubernetes.io/projected/50e78f30-18dc-4823-b78b-3500e19d4f6f-kube-api-access-8mb5g\") pod \"calico-apiserver-9f54fc4f9-v6lkx\" (UID: \"50e78f30-18dc-4823-b78b-3500e19d4f6f\") " pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" Dec 16 12:46:50.840421 kubelet[2807]: I1216 12:46:50.840406 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/db9e60f0-9cc3-4000-b32c-9ba313d4b676-goldmane-key-pair\") pod \"goldmane-7c778bb748-sdw6q\" (UID: \"db9e60f0-9cc3-4000-b32c-9ba313d4b676\") " pod="calico-system/goldmane-7c778bb748-sdw6q" Dec 16 12:46:50.840511 kubelet[2807]: I1216 12:46:50.840501 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/50e78f30-18dc-4823-b78b-3500e19d4f6f-calico-apiserver-certs\") pod \"calico-apiserver-9f54fc4f9-v6lkx\" (UID: \"50e78f30-18dc-4823-b78b-3500e19d4f6f\") " pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" Dec 16 12:46:50.840613 kubelet[2807]: I1216 12:46:50.840602 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db9e60f0-9cc3-4000-b32c-9ba313d4b676-config\") pod \"goldmane-7c778bb748-sdw6q\" (UID: \"db9e60f0-9cc3-4000-b32c-9ba313d4b676\") " pod="calico-system/goldmane-7c778bb748-sdw6q" Dec 16 12:46:51.021441 containerd[1624]: time="2025-12-16T12:46:51.021273237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df777597b-rfdvx,Uid:ae41bde4-2571-4fb7-adb8-a8808483af15,Namespace:calico-system,Attempt:0,}" 
Dec 16 12:46:51.037131 containerd[1624]: time="2025-12-16T12:46:51.037063416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-p72td,Uid:2ce9c252-c98d-4be2-ac79-61841133c167,Namespace:kube-system,Attempt:0,}" Dec 16 12:46:51.058366 containerd[1624]: time="2025-12-16T12:46:51.058321152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65685c564b-hpvbh,Uid:7b3fffe8-2720-473c-b01e-f0da86506b4f,Namespace:calico-system,Attempt:0,}" Dec 16 12:46:51.068870 containerd[1624]: time="2025-12-16T12:46:51.068779850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f54fc4f9-7wdcc,Uid:1081c7e3-0645-4858-ac62-90f635bb19a9,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:46:51.074490 containerd[1624]: time="2025-12-16T12:46:51.074443581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-bqsxl,Uid:8265dde8-f635-41ae-8628-19a7296dfdf0,Namespace:kube-system,Attempt:0,}" Dec 16 12:46:51.096437 containerd[1624]: time="2025-12-16T12:46:51.096299534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sdw6q,Uid:db9e60f0-9cc3-4000-b32c-9ba313d4b676,Namespace:calico-system,Attempt:0,}" Dec 16 12:46:51.109631 containerd[1624]: time="2025-12-16T12:46:51.109600681Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f54fc4f9-v6lkx,Uid:50e78f30-18dc-4823-b78b-3500e19d4f6f,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:46:51.286844 containerd[1624]: time="2025-12-16T12:46:51.286717797Z" level=error msg="Failed to destroy network for sandbox \"1d670342e686db1034a09dc75df3bd082262bda740d68a565b480f36e9210ff2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.287678 containerd[1624]: time="2025-12-16T12:46:51.287612914Z" level=error msg="Failed to destroy network for sandbox \"842c91873c843bd83d33ff08cd8c855852ff7a44445f139b741d3440b31ee821\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.291543 containerd[1624]: time="2025-12-16T12:46:51.291500818Z" level=error msg="Failed to destroy network for sandbox \"5a33aaa4bddf0932a8d21a9aafe25b21d00463ac77f21bc7e6cdce6ae7c84197\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.292419 containerd[1624]: time="2025-12-16T12:46:51.292331093Z" level=error msg="Failed to destroy network for sandbox \"510b5708c745f62fbcb91a05baceeec4161e4e5c8c810507a8db34594788b778\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.293321 containerd[1624]: time="2025-12-16T12:46:51.293287757Z" level=error msg="Failed to destroy network for sandbox \"ba51d897fac8cfdc81c633928eda8f9bdbc6b5d99d0ec88c3bbdaf6cea2a38e1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.293812 containerd[1624]: time="2025-12-16T12:46:51.293714397Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-66bc5c9577-p72td,Uid:2ce9c252-c98d-4be2-ac79-61841133c167,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d670342e686db1034a09dc75df3bd082262bda740d68a565b480f36e9210ff2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.301965 kubelet[2807]: E1216 12:46:51.301557 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d670342e686db1034a09dc75df3bd082262bda740d68a565b480f36e9210ff2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.301965 kubelet[2807]: E1216 12:46:51.301627 2807 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d670342e686db1034a09dc75df3bd082262bda740d68a565b480f36e9210ff2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-p72td" Dec 16 12:46:51.301965 kubelet[2807]: E1216 12:46:51.301645 2807 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d670342e686db1034a09dc75df3bd082262bda740d68a565b480f36e9210ff2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-p72td" Dec 16 12:46:51.302719 kubelet[2807]: E1216 12:46:51.301692 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-p72td_kube-system(2ce9c252-c98d-4be2-ac79-61841133c167)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-p72td_kube-system(2ce9c252-c98d-4be2-ac79-61841133c167)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1d670342e686db1034a09dc75df3bd082262bda740d68a565b480f36e9210ff2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-p72td" podUID="2ce9c252-c98d-4be2-ac79-61841133c167" Dec 16 12:46:51.304313 containerd[1624]: time="2025-12-16T12:46:51.302304079Z" level=error msg="Failed to destroy network for sandbox \"17e0e5159c7f149e64a8fdcd808352581a1125ce70587c61ccfa7b8635330e76\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.305145 containerd[1624]: time="2025-12-16T12:46:51.305123205Z" level=error msg="Failed to destroy network for sandbox \"20ba1cb4db2b55cb2410c2ba236b46dd43886a63ff67b57f71a33198a5a15f81\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.310447 containerd[1624]: time="2025-12-16T12:46:51.310393931Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65685c564b-hpvbh,Uid:7b3fffe8-2720-473c-b01e-f0da86506b4f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"842c91873c843bd83d33ff08cd8c855852ff7a44445f139b741d3440b31ee821\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.310709 kubelet[2807]: E1216 12:46:51.310675 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"842c91873c843bd83d33ff08cd8c855852ff7a44445f139b741d3440b31ee821\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.310954 kubelet[2807]: E1216 12:46:51.310720 2807 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"842c91873c843bd83d33ff08cd8c855852ff7a44445f139b741d3440b31ee821\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65685c564b-hpvbh" Dec 16 12:46:51.310954 kubelet[2807]: E1216 12:46:51.310736 2807 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"842c91873c843bd83d33ff08cd8c855852ff7a44445f139b741d3440b31ee821\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65685c564b-hpvbh" Dec 16 12:46:51.310954 kubelet[2807]: E1216 12:46:51.310779 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-65685c564b-hpvbh_calico-system(7b3fffe8-2720-473c-b01e-f0da86506b4f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-65685c564b-hpvbh_calico-system(7b3fffe8-2720-473c-b01e-f0da86506b4f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"842c91873c843bd83d33ff08cd8c855852ff7a44445f139b741d3440b31ee821\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-65685c564b-hpvbh" podUID="7b3fffe8-2720-473c-b01e-f0da86506b4f" Dec 16 12:46:51.312428 containerd[1624]: time="2025-12-16T12:46:51.312391331Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-bqsxl,Uid:8265dde8-f635-41ae-8628-19a7296dfdf0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a33aaa4bddf0932a8d21a9aafe25b21d00463ac77f21bc7e6cdce6ae7c84197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.312776 kubelet[2807]: E1216 12:46:51.312632 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"5a33aaa4bddf0932a8d21a9aafe25b21d00463ac77f21bc7e6cdce6ae7c84197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.312776 kubelet[2807]: E1216 12:46:51.312681 2807 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a33aaa4bddf0932a8d21a9aafe25b21d00463ac77f21bc7e6cdce6ae7c84197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-bqsxl" Dec 16 12:46:51.312776 kubelet[2807]: E1216 12:46:51.312696 2807 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a33aaa4bddf0932a8d21a9aafe25b21d00463ac77f21bc7e6cdce6ae7c84197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-bqsxl" Dec 16 12:46:51.312932 kubelet[2807]: E1216 12:46:51.312739 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-bqsxl_kube-system(8265dde8-f635-41ae-8628-19a7296dfdf0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-bqsxl_kube-system(8265dde8-f635-41ae-8628-19a7296dfdf0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5a33aaa4bddf0932a8d21a9aafe25b21d00463ac77f21bc7e6cdce6ae7c84197\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-bqsxl" podUID="8265dde8-f635-41ae-8628-19a7296dfdf0" Dec 16 12:46:51.314096 containerd[1624]: time="2025-12-16T12:46:51.314063572Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df777597b-rfdvx,Uid:ae41bde4-2571-4fb7-adb8-a8808483af15,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"510b5708c745f62fbcb91a05baceeec4161e4e5c8c810507a8db34594788b778\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.314580 kubelet[2807]: E1216 12:46:51.314436 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"510b5708c745f62fbcb91a05baceeec4161e4e5c8c810507a8db34594788b778\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.314940 kubelet[2807]: E1216 12:46:51.314666 2807 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"510b5708c745f62fbcb91a05baceeec4161e4e5c8c810507a8db34594788b778\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" Dec 16 12:46:51.314940 kubelet[2807]: E1216 12:46:51.314714 2807 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"510b5708c745f62fbcb91a05baceeec4161e4e5c8c810507a8db34594788b778\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" Dec 16 12:46:51.314940 kubelet[2807]: E1216 12:46:51.314831 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6df777597b-rfdvx_calico-system(ae41bde4-2571-4fb7-adb8-a8808483af15)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6df777597b-rfdvx_calico-system(ae41bde4-2571-4fb7-adb8-a8808483af15)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"510b5708c745f62fbcb91a05baceeec4161e4e5c8c810507a8db34594788b778\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15" Dec 16 12:46:51.315765 containerd[1624]: time="2025-12-16T12:46:51.315709945Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f54fc4f9-7wdcc,Uid:1081c7e3-0645-4858-ac62-90f635bb19a9,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba51d897fac8cfdc81c633928eda8f9bdbc6b5d99d0ec88c3bbdaf6cea2a38e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.316029 kubelet[2807]: E1216 12:46:51.316005 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba51d897fac8cfdc81c633928eda8f9bdbc6b5d99d0ec88c3bbdaf6cea2a38e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.316473 kubelet[2807]: E1216 12:46:51.316145 2807 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba51d897fac8cfdc81c633928eda8f9bdbc6b5d99d0ec88c3bbdaf6cea2a38e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" Dec 16 12:46:51.316473 kubelet[2807]: E1216 12:46:51.316171 2807 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba51d897fac8cfdc81c633928eda8f9bdbc6b5d99d0ec88c3bbdaf6cea2a38e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" Dec 16 12:46:51.316473 kubelet[2807]: E1216 12:46:51.316207 2807 pod_workers.go:1324] "Error 
syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9f54fc4f9-7wdcc_calico-apiserver(1081c7e3-0645-4858-ac62-90f635bb19a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9f54fc4f9-7wdcc_calico-apiserver(1081c7e3-0645-4858-ac62-90f635bb19a9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ba51d897fac8cfdc81c633928eda8f9bdbc6b5d99d0ec88c3bbdaf6cea2a38e1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9" Dec 16 12:46:51.316942 containerd[1624]: time="2025-12-16T12:46:51.316886186Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sdw6q,Uid:db9e60f0-9cc3-4000-b32c-9ba313d4b676,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"17e0e5159c7f149e64a8fdcd808352581a1125ce70587c61ccfa7b8635330e76\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.317167 kubelet[2807]: E1216 12:46:51.317132 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17e0e5159c7f149e64a8fdcd808352581a1125ce70587c61ccfa7b8635330e76\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.317459 kubelet[2807]: E1216 12:46:51.317378 2807 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17e0e5159c7f149e64a8fdcd808352581a1125ce70587c61ccfa7b8635330e76\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-sdw6q" Dec 16 12:46:51.317459 kubelet[2807]: E1216 12:46:51.317400 2807 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17e0e5159c7f149e64a8fdcd808352581a1125ce70587c61ccfa7b8635330e76\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-sdw6q" Dec 16 12:46:51.317459 kubelet[2807]: E1216 12:46:51.317444 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-sdw6q_calico-system(db9e60f0-9cc3-4000-b32c-9ba313d4b676)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-sdw6q_calico-system(db9e60f0-9cc3-4000-b32c-9ba313d4b676)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"17e0e5159c7f149e64a8fdcd808352581a1125ce70587c61ccfa7b8635330e76\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-sdw6q" 
podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676" Dec 16 12:46:51.317852 containerd[1624]: time="2025-12-16T12:46:51.317802243Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f54fc4f9-v6lkx,Uid:50e78f30-18dc-4823-b78b-3500e19d4f6f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"20ba1cb4db2b55cb2410c2ba236b46dd43886a63ff67b57f71a33198a5a15f81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.318614 kubelet[2807]: E1216 12:46:51.318589 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20ba1cb4db2b55cb2410c2ba236b46dd43886a63ff67b57f71a33198a5a15f81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:51.318666 kubelet[2807]: E1216 12:46:51.318618 2807 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20ba1cb4db2b55cb2410c2ba236b46dd43886a63ff67b57f71a33198a5a15f81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" Dec 16 12:46:51.318666 kubelet[2807]: E1216 12:46:51.318631 2807 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20ba1cb4db2b55cb2410c2ba236b46dd43886a63ff67b57f71a33198a5a15f81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" Dec 16 12:46:51.318726 kubelet[2807]: E1216 12:46:51.318684 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9f54fc4f9-v6lkx_calico-apiserver(50e78f30-18dc-4823-b78b-3500e19d4f6f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9f54fc4f9-v6lkx_calico-apiserver(50e78f30-18dc-4823-b78b-3500e19d4f6f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"20ba1cb4db2b55cb2410c2ba236b46dd43886a63ff67b57f71a33198a5a15f81\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f" Dec 16 12:46:51.556171 containerd[1624]: time="2025-12-16T12:46:51.556136795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 12:46:52.026254 systemd[1]: run-netns-cni\x2d0bf843f3\x2db166\x2da246\x2da4f5\x2d70d9044fe64f.mount: Deactivated successfully. Dec 16 12:46:52.026868 systemd[1]: run-netns-cni\x2df5d371d3\x2db403\x2d6aec\x2dedd3\x2d3f8c6abb54ff.mount: Deactivated successfully. Dec 16 12:46:52.026987 systemd[1]: run-netns-cni\x2d65c26f95\x2d3073\x2d886d\x2dce7e\x2d846fc250dde0.mount: Deactivated successfully. 
Dec 16 12:46:52.027079 systemd[1]: run-netns-cni\x2d05a5d10c\x2ddc8e\x2d2615\x2d8097\x2d6356b96294a0.mount: Deactivated successfully. Dec 16 12:46:52.422751 systemd[1]: Created slice kubepods-besteffort-podeff3d741_729e_4ed9_a6a3_d314f99d7c29.slice - libcontainer container kubepods-besteffort-podeff3d741_729e_4ed9_a6a3_d314f99d7c29.slice. Dec 16 12:46:52.426749 containerd[1624]: time="2025-12-16T12:46:52.426712028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h22qr,Uid:eff3d741-729e-4ed9-a6a3-d314f99d7c29,Namespace:calico-system,Attempt:0,}" Dec 16 12:46:52.476718 containerd[1624]: time="2025-12-16T12:46:52.476667475Z" level=error msg="Failed to destroy network for sandbox \"7f0f71c6ae8e65d3fa979057908af0e8bb2bb7a345e6e0f4e3e877dd58b860e3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:52.478700 systemd[1]: run-netns-cni\x2d352464d8\x2d136a\x2dd455\x2d29c5\x2da58bbe5679d8.mount: Deactivated successfully. Dec 16 12:46:52.479828 containerd[1624]: time="2025-12-16T12:46:52.479290338Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h22qr,Uid:eff3d741-729e-4ed9-a6a3-d314f99d7c29,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f0f71c6ae8e65d3fa979057908af0e8bb2bb7a345e6e0f4e3e877dd58b860e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:52.480114 kubelet[2807]: E1216 12:46:52.480076 2807 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f0f71c6ae8e65d3fa979057908af0e8bb2bb7a345e6e0f4e3e877dd58b860e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:46:52.480356 kubelet[2807]: E1216 12:46:52.480173 2807 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f0f71c6ae8e65d3fa979057908af0e8bb2bb7a345e6e0f4e3e877dd58b860e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h22qr" Dec 16 12:46:52.480356 kubelet[2807]: E1216 12:46:52.480196 2807 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f0f71c6ae8e65d3fa979057908af0e8bb2bb7a345e6e0f4e3e877dd58b860e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h22qr" Dec 16 12:46:52.480356 kubelet[2807]: E1216 12:46:52.480270 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h22qr_calico-system(eff3d741-729e-4ed9-a6a3-d314f99d7c29)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h22qr_calico-system(eff3d741-729e-4ed9-a6a3-d314f99d7c29)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"7f0f71c6ae8e65d3fa979057908af0e8bb2bb7a345e6e0f4e3e877dd58b860e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:46:59.825122 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3983706016.mount: Deactivated successfully. Dec 16 12:46:59.915099 containerd[1624]: time="2025-12-16T12:46:59.893691806Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Dec 16 12:46:59.916270 containerd[1624]: time="2025-12-16T12:46:59.915801538Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:59.968756 containerd[1624]: time="2025-12-16T12:46:59.968713873Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:59.974805 containerd[1624]: time="2025-12-16T12:46:59.974755341Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:46:59.975354 containerd[1624]: time="2025-12-16T12:46:59.975129188Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 8.41874872s" Dec 16 12:46:59.975354 containerd[1624]: time="2025-12-16T12:46:59.975157411Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Dec 16 12:47:00.013489 containerd[1624]: time="2025-12-16T12:47:00.013448883Z" level=info msg="CreateContainer within sandbox \"ba5a7d40de3781d228848ba6221a20f3569d650d730dcac7bd64d04646b40e33\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 12:47:00.087177 containerd[1624]: time="2025-12-16T12:47:00.086946662Z" level=info msg="Container 5422febd845da900116103be2bbd0ad6f577654cb9f5daa3083af9a7d07a831e: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:47:00.090077 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2019877828.mount: Deactivated successfully. 
Dec 16 12:47:00.134973 containerd[1624]: time="2025-12-16T12:47:00.134930362Z" level=info msg="CreateContainer within sandbox \"ba5a7d40de3781d228848ba6221a20f3569d650d730dcac7bd64d04646b40e33\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5422febd845da900116103be2bbd0ad6f577654cb9f5daa3083af9a7d07a831e\"" Dec 16 12:47:00.135590 containerd[1624]: time="2025-12-16T12:47:00.135567527Z" level=info msg="StartContainer for \"5422febd845da900116103be2bbd0ad6f577654cb9f5daa3083af9a7d07a831e\"" Dec 16 12:47:00.139539 containerd[1624]: time="2025-12-16T12:47:00.139497087Z" level=info msg="connecting to shim 5422febd845da900116103be2bbd0ad6f577654cb9f5daa3083af9a7d07a831e" address="unix:///run/containerd/s/f7abee2d438b28864fc602dda003b25158d9b4ac834efe312088c676cae39fb5" protocol=ttrpc version=3 Dec 16 12:47:00.255727 systemd[1]: Started cri-containerd-5422febd845da900116103be2bbd0ad6f577654cb9f5daa3083af9a7d07a831e.scope - libcontainer container 5422febd845da900116103be2bbd0ad6f577654cb9f5daa3083af9a7d07a831e. Dec 16 12:47:00.301567 kernel: kauditd_printk_skb: 6 callbacks suppressed Dec 16 12:47:00.302273 kernel: audit: type=1334 audit(1765889220.299:569): prog-id=172 op=LOAD Dec 16 12:47:00.299000 audit: BPF prog-id=172 op=LOAD Dec 16 12:47:00.312591 kernel: audit: type=1300 audit(1765889220.299:569): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000172488 a2=98 a3=0 items=0 ppid=3339 pid=3853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:00.299000 audit[3853]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000172488 a2=98 a3=0 items=0 ppid=3339 pid=3853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:00.299000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534323266656264383435646139303031313631303362653262626430 Dec 16 12:47:00.316145 kernel: audit: type=1327 audit(1765889220.299:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534323266656264383435646139303031313631303362653262626430 Dec 16 12:47:00.299000 audit: BPF prog-id=173 op=LOAD Dec 16 12:47:00.322748 kernel: audit: type=1334 audit(1765889220.299:570): prog-id=173 op=LOAD Dec 16 12:47:00.299000 audit[3853]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000172218 a2=98 a3=0 items=0 ppid=3339 pid=3853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:00.333555 kernel: audit: type=1300 audit(1765889220.299:570): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000172218 a2=98 a3=0 items=0 ppid=3339 pid=3853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:00.299000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534323266656264383435646139303031313631303362653262626430 Dec 16 12:47:00.345887 kernel: audit: type=1327 audit(1765889220.299:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534323266656264383435646139303031313631303362653262626430 Dec 16 12:47:00.345962 kernel: audit: type=1334 audit(1765889220.299:571): prog-id=173 op=UNLOAD Dec 16 12:47:00.345986 kernel: audit: type=1300 audit(1765889220.299:571): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:00.299000 audit: BPF prog-id=173 op=UNLOAD Dec 16 12:47:00.299000 audit[3853]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:00.352696 kernel: audit: type=1327 audit(1765889220.299:571): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534323266656264383435646139303031313631303362653262626430 Dec 16 12:47:00.299000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534323266656264383435646139303031313631303362653262626430 Dec 16 12:47:00.299000 audit: BPF prog-id=172 op=UNLOAD Dec 16 12:47:00.358709 kernel: audit: type=1334 audit(1765889220.299:572): prog-id=172 op=UNLOAD Dec 16 12:47:00.299000 audit[3853]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3339 pid=3853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:00.299000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534323266656264383435646139303031313631303362653262626430 Dec 16 12:47:00.299000 audit: BPF prog-id=174 op=LOAD Dec 16 12:47:00.299000 audit[3853]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001726e8 a2=98 a3=0 items=0 ppid=3339 pid=3853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:00.299000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534323266656264383435646139303031313631303362653262626430 Dec 16 12:47:00.383629 
containerd[1624]: time="2025-12-16T12:47:00.383580592Z" level=info msg="StartContainer for \"5422febd845da900116103be2bbd0ad6f577654cb9f5daa3083af9a7d07a831e\" returns successfully" Dec 16 12:47:00.581817 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 12:47:00.581917 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Dec 16 12:47:00.668082 kubelet[2807]: I1216 12:47:00.667841 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-v4brf" podStartSLOduration=1.718792176 podStartE2EDuration="22.666850731s" podCreationTimestamp="2025-12-16 12:46:38 +0000 UTC" firstStartedPulling="2025-12-16 12:46:39.027831081 +0000 UTC m=+22.778090074" lastFinishedPulling="2025-12-16 12:46:59.975889636 +0000 UTC m=+43.726148629" observedRunningTime="2025-12-16 12:47:00.664452957 +0000 UTC m=+44.414711960" watchObservedRunningTime="2025-12-16 12:47:00.666850731 +0000 UTC m=+44.417109725" Dec 16 12:47:00.920660 kubelet[2807]: I1216 12:47:00.920522 2807 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7b3fffe8-2720-473c-b01e-f0da86506b4f-whisker-backend-key-pair\") pod \"7b3fffe8-2720-473c-b01e-f0da86506b4f\" (UID: \"7b3fffe8-2720-473c-b01e-f0da86506b4f\") " Dec 16 12:47:00.920660 kubelet[2807]: I1216 12:47:00.920611 2807 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtmhd\" (UniqueName: \"kubernetes.io/projected/7b3fffe8-2720-473c-b01e-f0da86506b4f-kube-api-access-qtmhd\") pod \"7b3fffe8-2720-473c-b01e-f0da86506b4f\" (UID: \"7b3fffe8-2720-473c-b01e-f0da86506b4f\") " Dec 16 12:47:00.920660 kubelet[2807]: I1216 12:47:00.920654 2807 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b3fffe8-2720-473c-b01e-f0da86506b4f-whisker-ca-bundle\") pod \"7b3fffe8-2720-473c-b01e-f0da86506b4f\" (UID: \"7b3fffe8-2720-473c-b01e-f0da86506b4f\") " Dec 16 12:47:00.930398 systemd[1]: var-lib-kubelet-pods-7b3fffe8\x2d2720\x2d473c\x2db01e\x2df0da86506b4f-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqtmhd.mount: Deactivated successfully. Dec 16 12:47:00.938963 kubelet[2807]: I1216 12:47:00.936501 2807 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3fffe8-2720-473c-b01e-f0da86506b4f-kube-api-access-qtmhd" (OuterVolumeSpecName: "kube-api-access-qtmhd") pod "7b3fffe8-2720-473c-b01e-f0da86506b4f" (UID: "7b3fffe8-2720-473c-b01e-f0da86506b4f"). InnerVolumeSpecName "kube-api-access-qtmhd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 12:47:00.939321 kubelet[2807]: I1216 12:47:00.939228 2807 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3fffe8-2720-473c-b01e-f0da86506b4f-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "7b3fffe8-2720-473c-b01e-f0da86506b4f" (UID: "7b3fffe8-2720-473c-b01e-f0da86506b4f"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 12:47:00.967183 systemd[1]: var-lib-kubelet-pods-7b3fffe8\x2d2720\x2d473c\x2db01e\x2df0da86506b4f-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Dec 16 12:47:00.968830 kubelet[2807]: I1216 12:47:00.968700 2807 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3fffe8-2720-473c-b01e-f0da86506b4f-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "7b3fffe8-2720-473c-b01e-f0da86506b4f" (UID: "7b3fffe8-2720-473c-b01e-f0da86506b4f"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 12:47:01.021939 kubelet[2807]: I1216 12:47:01.021905 2807 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7b3fffe8-2720-473c-b01e-f0da86506b4f-whisker-backend-key-pair\") on node \"ci-4547-0-0-c-452f5360ea\" DevicePath \"\"" Dec 16 12:47:01.022266 kubelet[2807]: I1216 12:47:01.022253 2807 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qtmhd\" (UniqueName: \"kubernetes.io/projected/7b3fffe8-2720-473c-b01e-f0da86506b4f-kube-api-access-qtmhd\") on node \"ci-4547-0-0-c-452f5360ea\" DevicePath \"\"" Dec 16 12:47:01.022407 kubelet[2807]: I1216 12:47:01.022396 2807 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b3fffe8-2720-473c-b01e-f0da86506b4f-whisker-ca-bundle\") on node \"ci-4547-0-0-c-452f5360ea\" DevicePath \"\"" Dec 16 12:47:01.626709 systemd[1]: Removed slice kubepods-besteffort-pod7b3fffe8_2720_473c_b01e_f0da86506b4f.slice - libcontainer container kubepods-besteffort-pod7b3fffe8_2720_473c_b01e_f0da86506b4f.slice. Dec 16 12:47:01.715115 systemd[1]: Created slice kubepods-besteffort-pod373b77f0_3f56_4579_96a0_a033d280b187.slice - libcontainer container kubepods-besteffort-pod373b77f0_3f56_4579_96a0_a033d280b187.slice. Dec 16 12:47:01.828711 kubelet[2807]: I1216 12:47:01.828656 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/373b77f0-3f56-4579-96a0-a033d280b187-whisker-backend-key-pair\") pod \"whisker-78c5bf4bd4-c4c2s\" (UID: \"373b77f0-3f56-4579-96a0-a033d280b187\") " pod="calico-system/whisker-78c5bf4bd4-c4c2s" Dec 16 12:47:01.828711 kubelet[2807]: I1216 12:47:01.828710 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/373b77f0-3f56-4579-96a0-a033d280b187-whisker-ca-bundle\") pod \"whisker-78c5bf4bd4-c4c2s\" (UID: \"373b77f0-3f56-4579-96a0-a033d280b187\") " pod="calico-system/whisker-78c5bf4bd4-c4c2s" Dec 16 12:47:01.829102 kubelet[2807]: I1216 12:47:01.828756 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mvnh\" (UniqueName: \"kubernetes.io/projected/373b77f0-3f56-4579-96a0-a033d280b187-kube-api-access-5mvnh\") pod \"whisker-78c5bf4bd4-c4c2s\" (UID: \"373b77f0-3f56-4579-96a0-a033d280b187\") " pod="calico-system/whisker-78c5bf4bd4-c4c2s" Dec 16 12:47:02.022736 containerd[1624]: time="2025-12-16T12:47:02.022381376Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78c5bf4bd4-c4c2s,Uid:373b77f0-3f56-4579-96a0-a033d280b187,Namespace:calico-system,Attempt:0,}" Dec 16 12:47:02.356069 systemd-networkd[1521]: cali749c2902b23: Link UP Dec 16 12:47:02.356757 systemd-networkd[1521]: cali749c2902b23: Gained carrier Dec 16 12:47:02.384691 containerd[1624]: 2025-12-16 12:47:02.060 [INFO][3968] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 
12:47:02.384691 containerd[1624]: 2025-12-16 12:47:02.089 [INFO][3968] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0 whisker-78c5bf4bd4- calico-system 373b77f0-3f56-4579-96a0-a033d280b187 883 0 2025-12-16 12:47:01 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:78c5bf4bd4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-0-0-c-452f5360ea whisker-78c5bf4bd4-c4c2s eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali749c2902b23 [] [] }} ContainerID="0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" Namespace="calico-system" Pod="whisker-78c5bf4bd4-c4c2s" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-" Dec 16 12:47:02.384691 containerd[1624]: 2025-12-16 12:47:02.089 [INFO][3968] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" Namespace="calico-system" Pod="whisker-78c5bf4bd4-c4c2s" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0" Dec 16 12:47:02.384691 containerd[1624]: 2025-12-16 12:47:02.287 [INFO][3981] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" HandleID="k8s-pod-network.0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" Workload="ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0" Dec 16 12:47:02.384909 containerd[1624]: 2025-12-16 12:47:02.290 [INFO][3981] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" HandleID="k8s-pod-network.0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" Workload="ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00035ae60), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-c-452f5360ea", "pod":"whisker-78c5bf4bd4-c4c2s", "timestamp":"2025-12-16 12:47:02.287918094 +0000 UTC"}, Hostname:"ci-4547-0-0-c-452f5360ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:47:02.384909 containerd[1624]: 2025-12-16 12:47:02.290 [INFO][3981] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:47:02.384909 containerd[1624]: 2025-12-16 12:47:02.290 [INFO][3981] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:47:02.384909 containerd[1624]: 2025-12-16 12:47:02.292 [INFO][3981] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-c-452f5360ea' Dec 16 12:47:02.384909 containerd[1624]: 2025-12-16 12:47:02.306 [INFO][3981] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:02.384909 containerd[1624]: 2025-12-16 12:47:02.315 [INFO][3981] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:02.384909 containerd[1624]: 2025-12-16 12:47:02.320 [INFO][3981] ipam/ipam.go 511: Trying affinity for 192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:02.384909 containerd[1624]: 2025-12-16 12:47:02.321 [INFO][3981] ipam/ipam.go 158: Attempting to load block cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:02.384909 containerd[1624]: 2025-12-16 12:47:02.323 [INFO][3981] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:02.385079 containerd[1624]: 2025-12-16 12:47:02.324 [INFO][3981] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.94.0/26 handle="k8s-pod-network.0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:02.385079 containerd[1624]: 2025-12-16 12:47:02.325 [INFO][3981] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f Dec 16 12:47:02.385079 containerd[1624]: 2025-12-16 12:47:02.331 [INFO][3981] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.94.0/26 handle="k8s-pod-network.0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:02.385079 containerd[1624]: 2025-12-16 12:47:02.335 [INFO][3981] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.94.1/26] block=192.168.94.0/26 handle="k8s-pod-network.0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:02.385079 containerd[1624]: 2025-12-16 12:47:02.335 [INFO][3981] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.94.1/26] handle="k8s-pod-network.0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:02.385079 containerd[1624]: 2025-12-16 12:47:02.335 [INFO][3981] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:47:02.385079 containerd[1624]: 2025-12-16 12:47:02.335 [INFO][3981] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.94.1/26] IPv6=[] ContainerID="0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" HandleID="k8s-pod-network.0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" Workload="ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0" Dec 16 12:47:02.385691 containerd[1624]: 2025-12-16 12:47:02.338 [INFO][3968] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" Namespace="calico-system" Pod="whisker-78c5bf4bd4-c4c2s" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0", GenerateName:"whisker-78c5bf4bd4-", Namespace:"calico-system", SelfLink:"", UID:"373b77f0-3f56-4579-96a0-a033d280b187", ResourceVersion:"883", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 47, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78c5bf4bd4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"", Pod:"whisker-78c5bf4bd4-c4c2s", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.94.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali749c2902b23", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:02.385691 containerd[1624]: 2025-12-16 12:47:02.338 [INFO][3968] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.94.1/32] ContainerID="0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" Namespace="calico-system" Pod="whisker-78c5bf4bd4-c4c2s" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0" Dec 16 12:47:02.385762 containerd[1624]: 2025-12-16 12:47:02.338 [INFO][3968] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali749c2902b23 ContainerID="0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" Namespace="calico-system" Pod="whisker-78c5bf4bd4-c4c2s" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0" Dec 16 12:47:02.385762 containerd[1624]: 2025-12-16 12:47:02.358 [INFO][3968] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" Namespace="calico-system" Pod="whisker-78c5bf4bd4-c4c2s" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0" Dec 16 12:47:02.385837 containerd[1624]: 2025-12-16 12:47:02.360 [INFO][3968] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" Namespace="calico-system" 
Pod="whisker-78c5bf4bd4-c4c2s" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0", GenerateName:"whisker-78c5bf4bd4-", Namespace:"calico-system", SelfLink:"", UID:"373b77f0-3f56-4579-96a0-a033d280b187", ResourceVersion:"883", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 47, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78c5bf4bd4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f", Pod:"whisker-78c5bf4bd4-c4c2s", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.94.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali749c2902b23", MAC:"52:c1:12:72:1e:56", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:02.385885 containerd[1624]: 2025-12-16 12:47:02.381 [INFO][3968] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" Namespace="calico-system" Pod="whisker-78c5bf4bd4-c4c2s" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-whisker--78c5bf4bd4--c4c2s-eth0" Dec 16 12:47:02.442196 kubelet[2807]: I1216 12:47:02.441942 2807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3fffe8-2720-473c-b01e-f0da86506b4f" path="/var/lib/kubelet/pods/7b3fffe8-2720-473c-b01e-f0da86506b4f/volumes" Dec 16 12:47:02.523675 containerd[1624]: time="2025-12-16T12:47:02.523636088Z" level=info msg="connecting to shim 0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f" address="unix:///run/containerd/s/fc7ed98781d7305749417203785435fb8552eed38044e2a8b034eec0019cb7a6" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:47:02.549694 systemd[1]: Started cri-containerd-0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f.scope - libcontainer container 0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f. 
Dec 16 12:47:02.570000 audit: BPF prog-id=175 op=LOAD Dec 16 12:47:02.570000 audit: BPF prog-id=176 op=LOAD Dec 16 12:47:02.570000 audit[4106]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4095 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:02.570000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063323261616165353562666233333863366637633435383331663663 Dec 16 12:47:02.570000 audit: BPF prog-id=176 op=UNLOAD Dec 16 12:47:02.570000 audit[4106]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4095 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:02.570000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063323261616165353562666233333863366637633435383331663663 Dec 16 12:47:02.570000 audit: BPF prog-id=177 op=LOAD Dec 16 12:47:02.570000 audit[4106]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4095 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:02.570000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063323261616165353562666233333863366637633435383331663663 Dec 16 12:47:02.570000 audit: BPF prog-id=178 op=LOAD Dec 16 12:47:02.570000 audit[4106]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4095 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:02.570000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063323261616165353562666233333863366637633435383331663663 Dec 16 12:47:02.570000 audit: BPF prog-id=178 op=UNLOAD Dec 16 12:47:02.570000 audit[4106]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4095 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:02.570000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063323261616165353562666233333863366637633435383331663663 Dec 16 12:47:02.570000 audit: BPF prog-id=177 op=UNLOAD Dec 16 12:47:02.570000 audit[4106]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4095 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:02.570000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063323261616165353562666233333863366637633435383331663663 Dec 16 12:47:02.570000 audit: BPF prog-id=179 op=LOAD Dec 16 12:47:02.570000 audit[4106]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4095 pid=4106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:02.570000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063323261616165353562666233333863366637633435383331663663 Dec 16 12:47:02.614888 containerd[1624]: time="2025-12-16T12:47:02.614714514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78c5bf4bd4-c4c2s,Uid:373b77f0-3f56-4579-96a0-a033d280b187,Namespace:calico-system,Attempt:0,} returns sandbox id \"0c22aaae55bfb338c6f7c45831f6cbae1c0e9495a4243ad42a900ca9e05a8d9f\"" Dec 16 12:47:02.618344 containerd[1624]: time="2025-12-16T12:47:02.617383659Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:47:03.085226 containerd[1624]: time="2025-12-16T12:47:03.085159516Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:03.086420 containerd[1624]: time="2025-12-16T12:47:03.086377619Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:47:03.086629 containerd[1624]: time="2025-12-16T12:47:03.086519877Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:03.094530 kubelet[2807]: E1216 12:47:03.094477 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:47:03.094796 kubelet[2807]: E1216 12:47:03.094574 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:47:03.094796 kubelet[2807]: E1216 12:47:03.094659 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-78c5bf4bd4-c4c2s_calico-system(373b77f0-3f56-4579-96a0-a033d280b187): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not 
found" logger="UnhandledError" Dec 16 12:47:03.095762 containerd[1624]: time="2025-12-16T12:47:03.095665400Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:47:03.416124 containerd[1624]: time="2025-12-16T12:47:03.415979033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f54fc4f9-7wdcc,Uid:1081c7e3-0645-4858-ac62-90f635bb19a9,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:47:03.520165 containerd[1624]: time="2025-12-16T12:47:03.520013137Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:03.521456 containerd[1624]: time="2025-12-16T12:47:03.521221431Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:47:03.521677 containerd[1624]: time="2025-12-16T12:47:03.521226390Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:03.522037 kubelet[2807]: E1216 12:47:03.521867 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:47:03.522567 kubelet[2807]: E1216 12:47:03.522038 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:47:03.522567 kubelet[2807]: E1216 12:47:03.522115 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-78c5bf4bd4-c4c2s_calico-system(373b77f0-3f56-4579-96a0-a033d280b187): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:03.522567 kubelet[2807]: E1216 12:47:03.522149 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187" Dec 16 12:47:03.608236 systemd-networkd[1521]: cali3e491b6a78c: Link UP Dec 16 12:47:03.608691 systemd-networkd[1521]: cali3e491b6a78c: Gained carrier Dec 16 12:47:03.626681 containerd[1624]: 2025-12-16 12:47:03.466 [INFO][4167] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:47:03.626681 containerd[1624]: 2025-12-16 12:47:03.481 [INFO][4167] cni-plugin/plugin.go 340: Calico CNI 
found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0 calico-apiserver-9f54fc4f9- calico-apiserver 1081c7e3-0645-4858-ac62-90f635bb19a9 807 0 2025-12-16 12:46:34 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9f54fc4f9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-c-452f5360ea calico-apiserver-9f54fc4f9-7wdcc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3e491b6a78c [] [] }} ContainerID="fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-7wdcc" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-" Dec 16 12:47:03.626681 containerd[1624]: 2025-12-16 12:47:03.481 [INFO][4167] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-7wdcc" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0" Dec 16 12:47:03.626681 containerd[1624]: 2025-12-16 12:47:03.521 [INFO][4185] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" HandleID="k8s-pod-network.fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" Workload="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0" Dec 16 12:47:03.626987 containerd[1624]: 2025-12-16 12:47:03.521 [INFO][4185] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" HandleID="k8s-pod-network.fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" Workload="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e8fe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-c-452f5360ea", "pod":"calico-apiserver-9f54fc4f9-7wdcc", "timestamp":"2025-12-16 12:47:03.521015682 +0000 UTC"}, Hostname:"ci-4547-0-0-c-452f5360ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:47:03.626987 containerd[1624]: 2025-12-16 12:47:03.521 [INFO][4185] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:47:03.626987 containerd[1624]: 2025-12-16 12:47:03.521 [INFO][4185] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:47:03.626987 containerd[1624]: 2025-12-16 12:47:03.521 [INFO][4185] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-c-452f5360ea' Dec 16 12:47:03.626987 containerd[1624]: 2025-12-16 12:47:03.528 [INFO][4185] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:03.626987 containerd[1624]: 2025-12-16 12:47:03.583 [INFO][4185] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:03.626987 containerd[1624]: 2025-12-16 12:47:03.587 [INFO][4185] ipam/ipam.go 511: Trying affinity for 192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:03.626987 containerd[1624]: 2025-12-16 12:47:03.589 [INFO][4185] ipam/ipam.go 158: Attempting to load block cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:03.626987 containerd[1624]: 2025-12-16 12:47:03.591 [INFO][4185] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:03.627289 containerd[1624]: 2025-12-16 12:47:03.591 [INFO][4185] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.94.0/26 handle="k8s-pod-network.fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:03.627289 containerd[1624]: 2025-12-16 12:47:03.592 [INFO][4185] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6 Dec 16 12:47:03.627289 containerd[1624]: 2025-12-16 12:47:03.596 [INFO][4185] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.94.0/26 handle="k8s-pod-network.fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:03.627289 containerd[1624]: 2025-12-16 12:47:03.600 [INFO][4185] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.94.2/26] block=192.168.94.0/26 handle="k8s-pod-network.fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:03.627289 containerd[1624]: 2025-12-16 12:47:03.600 [INFO][4185] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.94.2/26] handle="k8s-pod-network.fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:03.627289 containerd[1624]: 2025-12-16 12:47:03.600 [INFO][4185] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:47:03.627289 containerd[1624]: 2025-12-16 12:47:03.601 [INFO][4185] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.94.2/26] IPv6=[] ContainerID="fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" HandleID="k8s-pod-network.fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" Workload="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0" Dec 16 12:47:03.628212 containerd[1624]: 2025-12-16 12:47:03.604 [INFO][4167] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-7wdcc" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0", GenerateName:"calico-apiserver-9f54fc4f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"1081c7e3-0645-4858-ac62-90f635bb19a9", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9f54fc4f9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"", Pod:"calico-apiserver-9f54fc4f9-7wdcc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3e491b6a78c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:03.628294 containerd[1624]: 2025-12-16 12:47:03.604 [INFO][4167] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.94.2/32] ContainerID="fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-7wdcc" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0" Dec 16 12:47:03.628294 containerd[1624]: 2025-12-16 12:47:03.604 [INFO][4167] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3e491b6a78c ContainerID="fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-7wdcc" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0" Dec 16 12:47:03.628294 containerd[1624]: 2025-12-16 12:47:03.609 [INFO][4167] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-7wdcc" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0" Dec 16 12:47:03.628921 containerd[1624]: 2025-12-16 12:47:03.610 [INFO][4167] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-7wdcc" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0", GenerateName:"calico-apiserver-9f54fc4f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"1081c7e3-0645-4858-ac62-90f635bb19a9", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9f54fc4f9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6", Pod:"calico-apiserver-9f54fc4f9-7wdcc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3e491b6a78c", MAC:"86:d2:c3:b0:1b:7c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:03.629012 containerd[1624]: 2025-12-16 12:47:03.622 [INFO][4167] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-7wdcc" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--7wdcc-eth0" Dec 16 12:47:03.653577 kubelet[2807]: E1216 12:47:03.652364 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187" Dec 16 12:47:03.654934 containerd[1624]: time="2025-12-16T12:47:03.654904947Z" level=info msg="connecting to shim fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6" address="unix:///run/containerd/s/c7767cdd6ea00baef266e1fa982d47ecf312dbd6519356eab1d43af542e76a0d" namespace=k8s.io 
protocol=ttrpc version=3 Dec 16 12:47:03.682679 systemd[1]: Started cri-containerd-fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6.scope - libcontainer container fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6. Dec 16 12:47:03.695000 audit: BPF prog-id=180 op=LOAD Dec 16 12:47:03.695000 audit: BPF prog-id=181 op=LOAD Dec 16 12:47:03.695000 audit[4218]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:03.695000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663356231646335623765333337336133653763626134646535313637 Dec 16 12:47:03.695000 audit: BPF prog-id=181 op=UNLOAD Dec 16 12:47:03.695000 audit[4218]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:03.695000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663356231646335623765333337336133653763626134646535313637 Dec 16 12:47:03.696000 audit: BPF prog-id=182 op=LOAD Dec 16 12:47:03.696000 audit[4218]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:03.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663356231646335623765333337336133653763626134646535313637 Dec 16 12:47:03.696000 audit: BPF prog-id=183 op=LOAD Dec 16 12:47:03.696000 audit[4218]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:03.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663356231646335623765333337336133653763626134646535313637 Dec 16 12:47:03.696000 audit: BPF prog-id=183 op=UNLOAD Dec 16 12:47:03.696000 audit[4218]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:03.696000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663356231646335623765333337336133653763626134646535313637 Dec 16 12:47:03.696000 audit: BPF prog-id=182 op=UNLOAD Dec 16 12:47:03.696000 audit[4218]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:03.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663356231646335623765333337336133653763626134646535313637 Dec 16 12:47:03.696000 audit: BPF prog-id=184 op=LOAD Dec 16 12:47:03.696000 audit[4218]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=4207 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:03.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663356231646335623765333337336133653763626134646535313637 Dec 16 12:47:03.704000 audit[4239]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=4239 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:47:03.704000 audit[4239]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7fff0dce8110 a2=0 a3=7fff0dce80fc items=0 ppid=2972 pid=4239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:03.704000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:47:03.708000 audit[4239]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=4239 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:47:03.708000 audit[4239]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff0dce8110 a2=0 a3=0 items=0 ppid=2972 pid=4239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:03.708000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:47:03.730332 containerd[1624]: time="2025-12-16T12:47:03.730246468Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f54fc4f9-7wdcc,Uid:1081c7e3-0645-4858-ac62-90f635bb19a9,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"fc5b1dc5b7e3373a3e7cba4de5167e55aa8e244c03464129e5bce6fa560926b6\"" Dec 16 12:47:03.731801 containerd[1624]: time="2025-12-16T12:47:03.731755881Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:47:03.740084 systemd-networkd[1521]: cali749c2902b23: Gained IPv6LL Dec 16 12:47:04.404256 
containerd[1624]: time="2025-12-16T12:47:04.404200128Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:04.405579 containerd[1624]: time="2025-12-16T12:47:04.405511686Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:47:04.405656 containerd[1624]: time="2025-12-16T12:47:04.405636512Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:04.405974 kubelet[2807]: E1216 12:47:04.405920 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:04.405974 kubelet[2807]: E1216 12:47:04.405975 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:04.406262 kubelet[2807]: E1216 12:47:04.406066 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-9f54fc4f9-7wdcc_calico-apiserver(1081c7e3-0645-4858-ac62-90f635bb19a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:04.406262 kubelet[2807]: E1216 12:47:04.406107 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9" Dec 16 12:47:04.415441 containerd[1624]: time="2025-12-16T12:47:04.415362137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-p72td,Uid:2ce9c252-c98d-4be2-ac79-61841133c167,Namespace:kube-system,Attempt:0,}" Dec 16 12:47:04.416809 containerd[1624]: time="2025-12-16T12:47:04.416751833Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sdw6q,Uid:db9e60f0-9cc3-4000-b32c-9ba313d4b676,Namespace:calico-system,Attempt:0,}" Dec 16 12:47:04.417773 containerd[1624]: time="2025-12-16T12:47:04.417652926Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h22qr,Uid:eff3d741-729e-4ed9-a6a3-d314f99d7c29,Namespace:calico-system,Attempt:0,}" Dec 16 12:47:04.577469 systemd-networkd[1521]: calibcb1c768bc9: Link UP Dec 16 12:47:04.578229 systemd-networkd[1521]: calibcb1c768bc9: Gained carrier Dec 16 12:47:04.598256 containerd[1624]: 2025-12-16 12:47:04.472 [INFO][4250] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:47:04.598256 containerd[1624]: 2025-12-16 12:47:04.485 [INFO][4250] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint 
projectcalico.org/v3} {ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0 goldmane-7c778bb748- calico-system db9e60f0-9cc3-4000-b32c-9ba313d4b676 809 0 2025-12-16 12:46:36 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4547-0-0-c-452f5360ea goldmane-7c778bb748-sdw6q eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calibcb1c768bc9 [] [] }} ContainerID="0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" Namespace="calico-system" Pod="goldmane-7c778bb748-sdw6q" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-" Dec 16 12:47:04.598256 containerd[1624]: 2025-12-16 12:47:04.485 [INFO][4250] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" Namespace="calico-system" Pod="goldmane-7c778bb748-sdw6q" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0" Dec 16 12:47:04.598256 containerd[1624]: 2025-12-16 12:47:04.530 [INFO][4289] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" HandleID="k8s-pod-network.0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" Workload="ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0" Dec 16 12:47:04.598464 containerd[1624]: 2025-12-16 12:47:04.530 [INFO][4289] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" HandleID="k8s-pod-network.0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" Workload="ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5770), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-c-452f5360ea", "pod":"goldmane-7c778bb748-sdw6q", "timestamp":"2025-12-16 12:47:04.530236155 +0000 UTC"}, Hostname:"ci-4547-0-0-c-452f5360ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:47:04.598464 containerd[1624]: 2025-12-16 12:47:04.530 [INFO][4289] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:47:04.598464 containerd[1624]: 2025-12-16 12:47:04.530 [INFO][4289] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:47:04.598464 containerd[1624]: 2025-12-16 12:47:04.530 [INFO][4289] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-c-452f5360ea' Dec 16 12:47:04.598464 containerd[1624]: 2025-12-16 12:47:04.536 [INFO][4289] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.598464 containerd[1624]: 2025-12-16 12:47:04.542 [INFO][4289] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.598464 containerd[1624]: 2025-12-16 12:47:04.548 [INFO][4289] ipam/ipam.go 511: Trying affinity for 192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.598464 containerd[1624]: 2025-12-16 12:47:04.550 [INFO][4289] ipam/ipam.go 158: Attempting to load block cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.598464 containerd[1624]: 2025-12-16 12:47:04.552 [INFO][4289] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.599457 containerd[1624]: 2025-12-16 12:47:04.552 [INFO][4289] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.94.0/26 handle="k8s-pod-network.0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.599457 containerd[1624]: 2025-12-16 12:47:04.555 [INFO][4289] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c Dec 16 12:47:04.599457 containerd[1624]: 2025-12-16 12:47:04.563 [INFO][4289] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.94.0/26 handle="k8s-pod-network.0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.599457 containerd[1624]: 2025-12-16 12:47:04.569 [INFO][4289] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.94.3/26] block=192.168.94.0/26 handle="k8s-pod-network.0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.599457 containerd[1624]: 2025-12-16 12:47:04.569 [INFO][4289] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.94.3/26] handle="k8s-pod-network.0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.599457 containerd[1624]: 2025-12-16 12:47:04.569 [INFO][4289] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:47:04.599457 containerd[1624]: 2025-12-16 12:47:04.569 [INFO][4289] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.94.3/26] IPv6=[] ContainerID="0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" HandleID="k8s-pod-network.0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" Workload="ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0" Dec 16 12:47:04.599756 containerd[1624]: 2025-12-16 12:47:04.572 [INFO][4250] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" Namespace="calico-system" Pod="goldmane-7c778bb748-sdw6q" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"db9e60f0-9cc3-4000-b32c-9ba313d4b676", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"", Pod:"goldmane-7c778bb748-sdw6q", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.94.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calibcb1c768bc9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:04.600117 containerd[1624]: 2025-12-16 12:47:04.572 [INFO][4250] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.94.3/32] ContainerID="0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" Namespace="calico-system" Pod="goldmane-7c778bb748-sdw6q" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0" Dec 16 12:47:04.600117 containerd[1624]: 2025-12-16 12:47:04.572 [INFO][4250] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibcb1c768bc9 ContainerID="0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" Namespace="calico-system" Pod="goldmane-7c778bb748-sdw6q" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0" Dec 16 12:47:04.600117 containerd[1624]: 2025-12-16 12:47:04.579 [INFO][4250] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" Namespace="calico-system" Pod="goldmane-7c778bb748-sdw6q" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0" Dec 16 12:47:04.600188 containerd[1624]: 2025-12-16 12:47:04.582 [INFO][4250] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" 
Namespace="calico-system" Pod="goldmane-7c778bb748-sdw6q" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"db9e60f0-9cc3-4000-b32c-9ba313d4b676", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c", Pod:"goldmane-7c778bb748-sdw6q", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.94.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calibcb1c768bc9", MAC:"d2:e7:5b:a0:94:ae", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:04.600236 containerd[1624]: 2025-12-16 12:47:04.595 [INFO][4250] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" Namespace="calico-system" Pod="goldmane-7c778bb748-sdw6q" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-goldmane--7c778bb748--sdw6q-eth0" Dec 16 12:47:04.628675 containerd[1624]: time="2025-12-16T12:47:04.628624455Z" level=info msg="connecting to shim 0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c" address="unix:///run/containerd/s/a5b09f0eb19212bcd5aa273b88ea8ae4fd1203070ff0491ab03d6ded2c3056f8" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:47:04.653918 kubelet[2807]: E1216 12:47:04.653683 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9" Dec 16 12:47:04.684774 systemd[1]: Started cri-containerd-0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c.scope - libcontainer container 0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c. 
Dec 16 12:47:04.705742 systemd-networkd[1521]: calibbaeb0aa8d0: Link UP Dec 16 12:47:04.707294 systemd-networkd[1521]: calibbaeb0aa8d0: Gained carrier Dec 16 12:47:04.727784 containerd[1624]: 2025-12-16 12:47:04.480 [INFO][4254] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:47:04.727784 containerd[1624]: 2025-12-16 12:47:04.503 [INFO][4254] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0 coredns-66bc5c9577- kube-system 2ce9c252-c98d-4be2-ac79-61841133c167 806 0 2025-12-16 12:46:24 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-c-452f5360ea coredns-66bc5c9577-p72td eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calibbaeb0aa8d0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" Namespace="kube-system" Pod="coredns-66bc5c9577-p72td" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-" Dec 16 12:47:04.727784 containerd[1624]: 2025-12-16 12:47:04.504 [INFO][4254] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" Namespace="kube-system" Pod="coredns-66bc5c9577-p72td" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0" Dec 16 12:47:04.727784 containerd[1624]: 2025-12-16 12:47:04.562 [INFO][4296] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" HandleID="k8s-pod-network.720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" Workload="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0" Dec 16 12:47:04.728287 containerd[1624]: 2025-12-16 12:47:04.564 [INFO][4296] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" HandleID="k8s-pod-network.720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" Workload="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5870), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-c-452f5360ea", "pod":"coredns-66bc5c9577-p72td", "timestamp":"2025-12-16 12:47:04.562300049 +0000 UTC"}, Hostname:"ci-4547-0-0-c-452f5360ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:47:04.728287 containerd[1624]: 2025-12-16 12:47:04.566 [INFO][4296] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:47:04.728287 containerd[1624]: 2025-12-16 12:47:04.569 [INFO][4296] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:47:04.728287 containerd[1624]: 2025-12-16 12:47:04.569 [INFO][4296] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-c-452f5360ea' Dec 16 12:47:04.728287 containerd[1624]: 2025-12-16 12:47:04.641 [INFO][4296] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.728287 containerd[1624]: 2025-12-16 12:47:04.651 [INFO][4296] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.728287 containerd[1624]: 2025-12-16 12:47:04.664 [INFO][4296] ipam/ipam.go 511: Trying affinity for 192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.728287 containerd[1624]: 2025-12-16 12:47:04.667 [INFO][4296] ipam/ipam.go 158: Attempting to load block cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.728287 containerd[1624]: 2025-12-16 12:47:04.672 [INFO][4296] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.728497 containerd[1624]: 2025-12-16 12:47:04.672 [INFO][4296] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.94.0/26 handle="k8s-pod-network.720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.728497 containerd[1624]: 2025-12-16 12:47:04.678 [INFO][4296] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334 Dec 16 12:47:04.728497 containerd[1624]: 2025-12-16 12:47:04.684 [INFO][4296] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.94.0/26 handle="k8s-pod-network.720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.728497 containerd[1624]: 2025-12-16 12:47:04.692 [INFO][4296] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.94.4/26] block=192.168.94.0/26 handle="k8s-pod-network.720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.728497 containerd[1624]: 2025-12-16 12:47:04.692 [INFO][4296] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.94.4/26] handle="k8s-pod-network.720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.728497 containerd[1624]: 2025-12-16 12:47:04.692 [INFO][4296] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:47:04.728497 containerd[1624]: 2025-12-16 12:47:04.692 [INFO][4296] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.94.4/26] IPv6=[] ContainerID="720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" HandleID="k8s-pod-network.720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" Workload="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0" Dec 16 12:47:04.728663 containerd[1624]: 2025-12-16 12:47:04.697 [INFO][4254] cni-plugin/k8s.go 418: Populated endpoint ContainerID="720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" Namespace="kube-system" Pod="coredns-66bc5c9577-p72td" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"2ce9c252-c98d-4be2-ac79-61841133c167", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"", Pod:"coredns-66bc5c9577-p72td", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibbaeb0aa8d0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:04.728663 containerd[1624]: 2025-12-16 12:47:04.697 [INFO][4254] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.94.4/32] ContainerID="720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" Namespace="kube-system" Pod="coredns-66bc5c9577-p72td" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0" Dec 16 12:47:04.728663 containerd[1624]: 2025-12-16 12:47:04.697 [INFO][4254] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibbaeb0aa8d0 ContainerID="720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" Namespace="kube-system" Pod="coredns-66bc5c9577-p72td" 
WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0" Dec 16 12:47:04.728663 containerd[1624]: 2025-12-16 12:47:04.708 [INFO][4254] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" Namespace="kube-system" Pod="coredns-66bc5c9577-p72td" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0" Dec 16 12:47:04.728663 containerd[1624]: 2025-12-16 12:47:04.711 [INFO][4254] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" Namespace="kube-system" Pod="coredns-66bc5c9577-p72td" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"2ce9c252-c98d-4be2-ac79-61841133c167", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334", Pod:"coredns-66bc5c9577-p72td", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibbaeb0aa8d0", MAC:"c2:df:42:77:dd:f9", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:04.728822 containerd[1624]: 2025-12-16 12:47:04.722 [INFO][4254] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" Namespace="kube-system" Pod="coredns-66bc5c9577-p72td" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--p72td-eth0" Dec 16 12:47:04.732000 audit[4380]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=4380 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 
12:47:04.732000 audit[4380]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7fffe4f30c40 a2=0 a3=7fffe4f30c2c items=0 ppid=2972 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.732000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:47:04.736000 audit: BPF prog-id=185 op=LOAD Dec 16 12:47:04.736000 audit[4380]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=4380 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:47:04.736000 audit[4380]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fffe4f30c40 a2=0 a3=0 items=0 ppid=2972 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.736000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:47:04.738000 audit: BPF prog-id=186 op=LOAD Dec 16 12:47:04.738000 audit[4353]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4342 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.738000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065666363316334323233333963346663663634373630333830346538 Dec 16 12:47:04.738000 audit: BPF prog-id=186 op=UNLOAD Dec 16 12:47:04.738000 audit[4353]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4342 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.738000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065666363316334323233333963346663663634373630333830346538 Dec 16 12:47:04.740000 audit: BPF prog-id=187 op=LOAD Dec 16 12:47:04.740000 audit[4353]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4342 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.740000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065666363316334323233333963346663663634373630333830346538 Dec 16 12:47:04.759000 audit: BPF prog-id=188 op=LOAD Dec 16 12:47:04.759000 audit[4353]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4342 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.759000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065666363316334323233333963346663663634373630333830346538 Dec 16 12:47:04.760000 audit: BPF prog-id=188 op=UNLOAD Dec 16 12:47:04.760000 audit[4353]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4342 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065666363316334323233333963346663663634373630333830346538 Dec 16 12:47:04.760000 audit: BPF prog-id=187 op=UNLOAD Dec 16 12:47:04.760000 audit[4353]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4342 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065666363316334323233333963346663663634373630333830346538 Dec 16 12:47:04.760000 audit: BPF prog-id=189 op=LOAD Dec 16 12:47:04.760000 audit[4353]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4342 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065666363316334323233333963346663663634373630333830346538 Dec 16 12:47:04.786808 containerd[1624]: time="2025-12-16T12:47:04.786753452Z" level=info msg="connecting to shim 720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334" address="unix:///run/containerd/s/f2f3d71513d6c4c0a9e632ca474e484c99eb96212f556229a81a12e3f55a0da1" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:47:04.827697 systemd-networkd[1521]: cali3e491b6a78c: Gained IPv6LL Dec 16 12:47:04.837424 systemd[1]: Started cri-containerd-720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334.scope - libcontainer container 720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334. 
Dec 16 12:47:04.857000 audit: BPF prog-id=190 op=LOAD Dec 16 12:47:04.858000 audit: BPF prog-id=191 op=LOAD Dec 16 12:47:04.858000 audit[4405]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=4391 pid=4405 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.858000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732306266626263626338613161383830653138653066303762613633 Dec 16 12:47:04.858000 audit: BPF prog-id=191 op=UNLOAD Dec 16 12:47:04.858000 audit[4405]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4391 pid=4405 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.858000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732306266626263626338613161383830653138653066303762613633 Dec 16 12:47:04.859000 audit: BPF prog-id=192 op=LOAD Dec 16 12:47:04.859000 audit[4405]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=4391 pid=4405 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.859000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732306266626263626338613161383830653138653066303762613633 Dec 16 12:47:04.859000 audit: BPF prog-id=193 op=LOAD Dec 16 12:47:04.859000 audit[4405]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=4391 pid=4405 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.859000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732306266626263626338613161383830653138653066303762613633 Dec 16 12:47:04.859000 audit: BPF prog-id=193 op=UNLOAD Dec 16 12:47:04.859000 audit[4405]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4391 pid=4405 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.859000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732306266626263626338613161383830653138653066303762613633 Dec 16 12:47:04.859000 audit: BPF prog-id=192 op=UNLOAD Dec 16 12:47:04.859000 audit[4405]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4391 pid=4405 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.859000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732306266626263626338613161383830653138653066303762613633 Dec 16 12:47:04.859000 audit: BPF prog-id=194 op=LOAD Dec 16 12:47:04.859000 audit[4405]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=4391 pid=4405 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.859000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732306266626263626338613161383830653138653066303762613633 Dec 16 12:47:04.862363 containerd[1624]: time="2025-12-16T12:47:04.862338776Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sdw6q,Uid:db9e60f0-9cc3-4000-b32c-9ba313d4b676,Namespace:calico-system,Attempt:0,} returns sandbox id \"0efcc1c422339c4fcf647603804e85fa14bd9cc0c72af075b23fc80d03729e9c\"" Dec 16 12:47:04.866076 containerd[1624]: time="2025-12-16T12:47:04.866052723Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:47:04.885662 systemd-networkd[1521]: cali3093ba0761a: Link UP Dec 16 12:47:04.887036 systemd-networkd[1521]: cali3093ba0761a: Gained carrier Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.490 [INFO][4258] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.515 [INFO][4258] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0 csi-node-driver- calico-system eff3d741-729e-4ed9-a6a3-d314f99d7c29 694 0 2025-12-16 12:46:38 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-0-0-c-452f5360ea csi-node-driver-h22qr eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali3093ba0761a [] [] }} ContainerID="4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" Namespace="calico-system" Pod="csi-node-driver-h22qr" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-" Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.518 [INFO][4258] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" Namespace="calico-system" Pod="csi-node-driver-h22qr" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0" Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.568 [INFO][4302] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" HandleID="k8s-pod-network.4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" Workload="ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0" Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.571 [INFO][4302] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" HandleID="k8s-pod-network.4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" Workload="ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000332870), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-c-452f5360ea", "pod":"csi-node-driver-h22qr", "timestamp":"2025-12-16 12:47:04.568891603 +0000 UTC"}, Hostname:"ci-4547-0-0-c-452f5360ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.571 [INFO][4302] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.693 [INFO][4302] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.694 [INFO][4302] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-c-452f5360ea' Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.757 [INFO][4302] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.773 [INFO][4302] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.789 [INFO][4302] ipam/ipam.go 511: Trying affinity for 192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.794 [INFO][4302] ipam/ipam.go 158: Attempting to load block cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.801 [INFO][4302] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.802 [INFO][4302] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.94.0/26 handle="k8s-pod-network.4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.807 [INFO][4302] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1 Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.820 [INFO][4302] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.94.0/26 handle="k8s-pod-network.4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.842 [INFO][4302] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.94.5/26] block=192.168.94.0/26 handle="k8s-pod-network.4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" host="ci-4547-0-0-c-452f5360ea" 
Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.846 [INFO][4302] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.94.5/26] handle="k8s-pod-network.4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.846 [INFO][4302] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:47:04.916551 containerd[1624]: 2025-12-16 12:47:04.846 [INFO][4302] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.94.5/26] IPv6=[] ContainerID="4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" HandleID="k8s-pod-network.4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" Workload="ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0" Dec 16 12:47:04.917304 containerd[1624]: 2025-12-16 12:47:04.882 [INFO][4258] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" Namespace="calico-system" Pod="csi-node-driver-h22qr" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eff3d741-729e-4ed9-a6a3-d314f99d7c29", ResourceVersion:"694", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"", Pod:"csi-node-driver-h22qr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.94.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3093ba0761a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:04.917304 containerd[1624]: 2025-12-16 12:47:04.882 [INFO][4258] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.94.5/32] ContainerID="4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" Namespace="calico-system" Pod="csi-node-driver-h22qr" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0" Dec 16 12:47:04.917304 containerd[1624]: 2025-12-16 12:47:04.882 [INFO][4258] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3093ba0761a ContainerID="4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" Namespace="calico-system" Pod="csi-node-driver-h22qr" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0" Dec 16 12:47:04.917304 containerd[1624]: 2025-12-16 12:47:04.888 [INFO][4258] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" Namespace="calico-system" Pod="csi-node-driver-h22qr" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0" Dec 16 12:47:04.917304 containerd[1624]: 2025-12-16 12:47:04.889 [INFO][4258] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" Namespace="calico-system" Pod="csi-node-driver-h22qr" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eff3d741-729e-4ed9-a6a3-d314f99d7c29", ResourceVersion:"694", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1", Pod:"csi-node-driver-h22qr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.94.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3093ba0761a", MAC:"f2:63:be:e8:0e:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:04.917304 containerd[1624]: 2025-12-16 12:47:04.910 [INFO][4258] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" Namespace="calico-system" Pod="csi-node-driver-h22qr" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-csi--node--driver--h22qr-eth0" Dec 16 12:47:04.921564 containerd[1624]: time="2025-12-16T12:47:04.920982590Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-p72td,Uid:2ce9c252-c98d-4be2-ac79-61841133c167,Namespace:kube-system,Attempt:0,} returns sandbox id \"720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334\"" Dec 16 12:47:04.927836 containerd[1624]: time="2025-12-16T12:47:04.927809399Z" level=info msg="CreateContainer within sandbox \"720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:47:04.947458 containerd[1624]: time="2025-12-16T12:47:04.947380447Z" level=info msg="connecting to shim 4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1" address="unix:///run/containerd/s/1713435ecc9f413d4fabd489198b4e0bee3497260274cccb483ad4d9cb12d108" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:47:04.952299 containerd[1624]: time="2025-12-16T12:47:04.952119571Z" level=info 
msg="Container d0f34cf7140b56d420ade73433b2c8a9adb9d984d4117b89461ebf14ee6ca389: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:47:04.958777 containerd[1624]: time="2025-12-16T12:47:04.958751663Z" level=info msg="CreateContainer within sandbox \"720bfbbcbc8a1a880e18e0f07ba63b8e6d297a2413c8dd417c2483e8f2f00334\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d0f34cf7140b56d420ade73433b2c8a9adb9d984d4117b89461ebf14ee6ca389\"" Dec 16 12:47:04.960056 containerd[1624]: time="2025-12-16T12:47:04.959810072Z" level=info msg="StartContainer for \"d0f34cf7140b56d420ade73433b2c8a9adb9d984d4117b89461ebf14ee6ca389\"" Dec 16 12:47:04.962666 containerd[1624]: time="2025-12-16T12:47:04.962517258Z" level=info msg="connecting to shim d0f34cf7140b56d420ade73433b2c8a9adb9d984d4117b89461ebf14ee6ca389" address="unix:///run/containerd/s/f2f3d71513d6c4c0a9e632ca474e484c99eb96212f556229a81a12e3f55a0da1" protocol=ttrpc version=3 Dec 16 12:47:04.972748 systemd[1]: Started cri-containerd-4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1.scope - libcontainer container 4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1. Dec 16 12:47:04.978133 systemd[1]: Started cri-containerd-d0f34cf7140b56d420ade73433b2c8a9adb9d984d4117b89461ebf14ee6ca389.scope - libcontainer container d0f34cf7140b56d420ade73433b2c8a9adb9d984d4117b89461ebf14ee6ca389. Dec 16 12:47:04.987000 audit: BPF prog-id=195 op=LOAD Dec 16 12:47:04.987000 audit: BPF prog-id=196 op=LOAD Dec 16 12:47:04.987000 audit[4468]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186238 a2=98 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.987000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464643536376564386236393231366439353831343039366462323936 Dec 16 12:47:04.988000 audit: BPF prog-id=196 op=UNLOAD Dec 16 12:47:04.988000 audit[4468]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464643536376564386236393231366439353831343039366462323936 Dec 16 12:47:04.988000 audit: BPF prog-id=197 op=LOAD Dec 16 12:47:04.988000 audit[4468]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464643536376564386236393231366439353831343039366462323936 Dec 16 12:47:04.988000 audit: BPF prog-id=198 op=LOAD Dec 16 
12:47:04.988000 audit[4468]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464643536376564386236393231366439353831343039366462323936 Dec 16 12:47:04.988000 audit: BPF prog-id=198 op=UNLOAD Dec 16 12:47:04.988000 audit[4468]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464643536376564386236393231366439353831343039366462323936 Dec 16 12:47:04.988000 audit: BPF prog-id=197 op=UNLOAD Dec 16 12:47:04.988000 audit[4468]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464643536376564386236393231366439353831343039366462323936 Dec 16 12:47:04.988000 audit: BPF prog-id=199 op=LOAD Dec 16 12:47:04.988000 audit[4468]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=4456 pid=4468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3464643536376564386236393231366439353831343039366462323936 Dec 16 12:47:04.990000 audit: BPF prog-id=200 op=LOAD Dec 16 12:47:04.991000 audit: BPF prog-id=201 op=LOAD Dec 16 12:47:04.991000 audit[4479]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4391 pid=4479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430663334636637313430623536643432306164653733343333623263 Dec 16 12:47:04.991000 audit: BPF prog-id=201 op=UNLOAD Dec 16 12:47:04.991000 audit[4479]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 
a3=0 items=0 ppid=4391 pid=4479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430663334636637313430623536643432306164653733343333623263 Dec 16 12:47:04.991000 audit: BPF prog-id=202 op=LOAD Dec 16 12:47:04.991000 audit[4479]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4391 pid=4479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430663334636637313430623536643432306164653733343333623263 Dec 16 12:47:04.991000 audit: BPF prog-id=203 op=LOAD Dec 16 12:47:04.991000 audit[4479]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4391 pid=4479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430663334636637313430623536643432306164653733343333623263 Dec 16 12:47:04.991000 audit: BPF prog-id=203 op=UNLOAD Dec 16 12:47:04.991000 audit[4479]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4391 pid=4479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430663334636637313430623536643432306164653733343333623263 Dec 16 12:47:04.991000 audit: BPF prog-id=202 op=UNLOAD Dec 16 12:47:04.991000 audit[4479]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4391 pid=4479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430663334636637313430623536643432306164653733343333623263 Dec 16 12:47:04.991000 audit: BPF prog-id=204 op=LOAD Dec 16 12:47:04.991000 audit[4479]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=4391 pid=4479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:04.991000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430663334636637313430623536643432306164653733343333623263 Dec 16 12:47:05.012362 containerd[1624]: time="2025-12-16T12:47:05.012257785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h22qr,Uid:eff3d741-729e-4ed9-a6a3-d314f99d7c29,Namespace:calico-system,Attempt:0,} returns sandbox id \"4dd567ed8b69216d95814096db296d5b3fedfdd049aa6ab12b101dd17f67f2d1\"" Dec 16 12:47:05.012713 containerd[1624]: time="2025-12-16T12:47:05.012337676Z" level=info msg="StartContainer for \"d0f34cf7140b56d420ade73433b2c8a9adb9d984d4117b89461ebf14ee6ca389\" returns successfully" Dec 16 12:47:05.315430 containerd[1624]: time="2025-12-16T12:47:05.315207138Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:05.316924 containerd[1624]: time="2025-12-16T12:47:05.316711902Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:47:05.316924 containerd[1624]: time="2025-12-16T12:47:05.316848880Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:05.317205 kubelet[2807]: E1216 12:47:05.317122 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:47:05.317317 kubelet[2807]: E1216 12:47:05.317236 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:47:05.317426 kubelet[2807]: E1216 12:47:05.317388 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sdw6q_calico-system(db9e60f0-9cc3-4000-b32c-9ba313d4b676): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:05.317517 kubelet[2807]: E1216 12:47:05.317429 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676" Dec 16 12:47:05.318497 containerd[1624]: time="2025-12-16T12:47:05.318427904Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:47:05.414308 containerd[1624]: time="2025-12-16T12:47:05.414267635Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-9f54fc4f9-v6lkx,Uid:50e78f30-18dc-4823-b78b-3500e19d4f6f,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:47:05.531088 systemd-networkd[1521]: cali1854643f905: Link UP Dec 16 12:47:05.531785 systemd-networkd[1521]: cali1854643f905: Gained carrier Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.445 [INFO][4526] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.456 [INFO][4526] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0 calico-apiserver-9f54fc4f9- calico-apiserver 50e78f30-18dc-4823-b78b-3500e19d4f6f 811 0 2025-12-16 12:46:34 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9f54fc4f9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-c-452f5360ea calico-apiserver-9f54fc4f9-v6lkx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1854643f905 [] [] }} ContainerID="bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-v6lkx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.456 [INFO][4526] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-v6lkx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.485 [INFO][4538] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" HandleID="k8s-pod-network.bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" Workload="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.485 [INFO][4538] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" HandleID="k8s-pod-network.bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" Workload="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f590), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-c-452f5360ea", "pod":"calico-apiserver-9f54fc4f9-v6lkx", "timestamp":"2025-12-16 12:47:05.485782513 +0000 UTC"}, Hostname:"ci-4547-0-0-c-452f5360ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.485 [INFO][4538] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.486 [INFO][4538] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
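The ipam/ipam_plugin.go records above dump the Calico IPAM request as an ipam.AutoAssignArgs value. The following Go sketch mirrors only the fields visible in that dump, purely as a reading aid; it is not the real libcalico-go type (which carries more fields), and the values are copied from the log lines above:

    package main

    import "fmt"

    // autoAssignArgs mirrors only the fields visible in the log dump above.
    // It is a reading aid, not the actual libcalico-go ipam.AutoAssignArgs type.
    type autoAssignArgs struct {
    	Num4        int               // IPv4 addresses requested (1 here)
    	Num6        int               // IPv6 addresses requested (0 here)
    	HandleID    *string           // "k8s-pod-network.<sandbox container ID>"
    	Attrs       map[string]string // namespace/node/pod/timestamp stored with the allocation
    	Hostname    string            // node claiming the address
    	IntendedUse string            // "Workload" for pod IPs
    }

    func main() {
    	handle := "k8s-pod-network.bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c"
    	req := autoAssignArgs{
    		Num4:     1,
    		Num6:     0,
    		HandleID: &handle,
    		Attrs: map[string]string{
    			"namespace": "calico-apiserver",
    			"node":      "ci-4547-0-0-c-452f5360ea",
    			"pod":       "calico-apiserver-9f54fc4f9-v6lkx",
    		},
    		Hostname:    "ci-4547-0-0-c-452f5360ea",
    		IntendedUse: "Workload",
    	}
    	fmt.Printf("requesting %d IPv4 / %d IPv6 for %s on %s\n",
    		req.Num4, req.Num6, *req.HandleID, req.Hostname)
    }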
Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.486 [INFO][4538] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-c-452f5360ea' Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.493 [INFO][4538] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.497 [INFO][4538] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.501 [INFO][4538] ipam/ipam.go 511: Trying affinity for 192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.503 [INFO][4538] ipam/ipam.go 158: Attempting to load block cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.505 [INFO][4538] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.505 [INFO][4538] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.94.0/26 handle="k8s-pod-network.bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.507 [INFO][4538] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.513 [INFO][4538] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.94.0/26 handle="k8s-pod-network.bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.520 [INFO][4538] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.94.6/26] block=192.168.94.0/26 handle="k8s-pod-network.bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.521 [INFO][4538] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.94.6/26] handle="k8s-pod-network.bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.521 [INFO][4538] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
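The IPAM lines above show the node holding an affinity for block 192.168.94.0/26 and claiming 192.168.94.6 from it. A minimal, standard-library-only Go check of that containment and of the block capacity (values taken from the log):

    package main

    import (
    	"fmt"
    	"net/netip"
    )

    func main() {
    	// Block and address from the IPAM records above.
    	block := netip.MustParsePrefix("192.168.94.0/26")
    	addr := netip.MustParseAddr("192.168.94.6")

    	fmt.Println("block contains addr:", block.Contains(addr)) // true

    	// A /26 covers 2^(32-26) = 64 addresses (.0 through .63 of this block).
    	size := 1 << (32 - block.Bits())
    	fmt.Println("addresses in block:", size) // 64
    }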
Dec 16 12:47:05.554474 containerd[1624]: 2025-12-16 12:47:05.521 [INFO][4538] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.94.6/26] IPv6=[] ContainerID="bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" HandleID="k8s-pod-network.bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" Workload="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0" Dec 16 12:47:05.554997 containerd[1624]: 2025-12-16 12:47:05.524 [INFO][4526] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-v6lkx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0", GenerateName:"calico-apiserver-9f54fc4f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"50e78f30-18dc-4823-b78b-3500e19d4f6f", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9f54fc4f9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"", Pod:"calico-apiserver-9f54fc4f9-v6lkx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1854643f905", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:05.554997 containerd[1624]: 2025-12-16 12:47:05.525 [INFO][4526] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.94.6/32] ContainerID="bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-v6lkx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0" Dec 16 12:47:05.554997 containerd[1624]: 2025-12-16 12:47:05.525 [INFO][4526] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1854643f905 ContainerID="bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-v6lkx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0" Dec 16 12:47:05.554997 containerd[1624]: 2025-12-16 12:47:05.531 [INFO][4526] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-v6lkx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0" Dec 16 12:47:05.554997 containerd[1624]: 2025-12-16 12:47:05.532 [INFO][4526] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-v6lkx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0", GenerateName:"calico-apiserver-9f54fc4f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"50e78f30-18dc-4823-b78b-3500e19d4f6f", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9f54fc4f9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c", Pod:"calico-apiserver-9f54fc4f9-v6lkx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1854643f905", MAC:"16:23:3c:b4:9c:fa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:05.554997 containerd[1624]: 2025-12-16 12:47:05.549 [INFO][4526] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" Namespace="calico-apiserver" Pod="calico-apiserver-9f54fc4f9-v6lkx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--apiserver--9f54fc4f9--v6lkx-eth0" Dec 16 12:47:05.584776 containerd[1624]: time="2025-12-16T12:47:05.584675924Z" level=info msg="connecting to shim bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c" address="unix:///run/containerd/s/aec905dc034be157c9575c377f3f7cc1cf7b53175c140fc01879edb62728e21d" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:47:05.628726 systemd[1]: Started cri-containerd-bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c.scope - libcontainer container bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c. 
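The PullImage failures in this log come from containerd's CRI plugin failing to resolve ghcr.io/flatcar/calico/* tags (the registry answers 404, surfaced as NotFound). A sketch of how the same pull and NotFound check could look against the containerd 1.x Go client; the socket path, the "k8s.io" namespace and the import paths are assumptions based on a stock containerd layout, not something this log confirms:

    package main

    import (
    	"context"
    	"fmt"
    	"log"

    	containerd "github.com/containerd/containerd"
    	"github.com/containerd/containerd/errdefs"
    	"github.com/containerd/containerd/namespaces"
    )

    func main() {
    	// Default containerd socket; the CRI sandboxes above live in the "k8s.io" namespace.
    	client, err := containerd.New("/run/containerd/containerd.sock")
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer client.Close()

    	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

    	// Same reference the kubelet asked for in the records above.
    	_, err = client.Pull(ctx, "ghcr.io/flatcar/calico/goldmane:v3.30.4", containerd.WithPullUnpack)
    	if errdefs.IsNotFound(err) {
    		fmt.Println("registry does not have this tag:", err) // matches the 404 / NotFound in the log
    	} else if err != nil {
    		log.Fatal(err)
    	}
    }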
Dec 16 12:47:05.666382 kubelet[2807]: E1216 12:47:05.666221 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9" Dec 16 12:47:05.666382 kubelet[2807]: E1216 12:47:05.666328 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676" Dec 16 12:47:05.682605 kernel: kauditd_printk_skb: 149 callbacks suppressed Dec 16 12:47:05.682695 kernel: audit: type=1334 audit(1765889225.679:626): prog-id=205 op=LOAD Dec 16 12:47:05.679000 audit: BPF prog-id=205 op=LOAD Dec 16 12:47:05.685302 kubelet[2807]: I1216 12:47:05.685253 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-p72td" podStartSLOduration=41.685237054 podStartE2EDuration="41.685237054s" podCreationTimestamp="2025-12-16 12:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:47:05.684875911 +0000 UTC m=+49.435134904" watchObservedRunningTime="2025-12-16 12:47:05.685237054 +0000 UTC m=+49.435496047" Dec 16 12:47:05.689650 kernel: audit: type=1334 audit(1765889225.679:627): prog-id=206 op=LOAD Dec 16 12:47:05.679000 audit: BPF prog-id=206 op=LOAD Dec 16 12:47:05.679000 audit[4570]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4558 pid=4570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.697549 kernel: audit: type=1300 audit(1765889225.679:627): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4558 pid=4570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.679000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265653363623233316636313764643431633933373261646465306536 Dec 16 12:47:05.707660 kernel: audit: type=1327 audit(1765889225.679:627): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265653363623233316636313764643431633933373261646465306536 Dec 16 12:47:05.710591 kernel: audit: type=1334 audit(1765889225.679:628): prog-id=206 op=UNLOAD Dec 16 12:47:05.679000 audit: BPF prog-id=206 
op=UNLOAD Dec 16 12:47:05.679000 audit[4570]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.719412 kernel: audit: type=1300 audit(1765889225.679:628): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.679000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265653363623233316636313764643431633933373261646465306536 Dec 16 12:47:05.745097 kernel: audit: type=1327 audit(1765889225.679:628): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265653363623233316636313764643431633933373261646465306536 Dec 16 12:47:05.745198 kernel: audit: type=1334 audit(1765889225.679:629): prog-id=207 op=LOAD Dec 16 12:47:05.679000 audit: BPF prog-id=207 op=LOAD Dec 16 12:47:05.679000 audit[4570]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4558 pid=4570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.767955 kernel: audit: type=1300 audit(1765889225.679:629): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4558 pid=4570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.768043 kernel: audit: type=1327 audit(1765889225.679:629): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265653363623233316636313764643431633933373261646465306536 Dec 16 12:47:05.679000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265653363623233316636313764643431633933373261646465306536 Dec 16 12:47:05.768116 containerd[1624]: time="2025-12-16T12:47:05.766074875Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:05.770633 containerd[1624]: time="2025-12-16T12:47:05.770484485Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:47:05.679000 audit: BPF prog-id=208 op=LOAD Dec 16 12:47:05.679000 audit[4570]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4558 pid=4570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.679000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265653363623233316636313764643431633933373261646465306536 Dec 16 12:47:05.679000 audit: BPF prog-id=208 op=UNLOAD Dec 16 12:47:05.771259 containerd[1624]: time="2025-12-16T12:47:05.770947210Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:05.771306 kubelet[2807]: E1216 12:47:05.771194 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:47:05.679000 audit[4570]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.771436 kubelet[2807]: E1216 12:47:05.771348 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:47:05.679000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265653363623233316636313764643431633933373261646465306536 Dec 16 12:47:05.679000 audit: BPF prog-id=207 op=UNLOAD Dec 16 12:47:05.679000 audit[4570]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4558 pid=4570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.771571 kubelet[2807]: E1216 12:47:05.771460 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-h22qr_calico-system(eff3d741-729e-4ed9-a6a3-d314f99d7c29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:05.679000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265653363623233316636313764643431633933373261646465306536 Dec 16 12:47:05.679000 audit: BPF prog-id=209 op=LOAD Dec 16 12:47:05.679000 audit[4570]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4558 pid=4570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.679000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265653363623233316636313764643431633933373261646465306536 Dec 16 12:47:05.775704 containerd[1624]: time="2025-12-16T12:47:05.775665593Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:47:05.780000 audit[4595]: NETFILTER_CFG table=filter:119 family=2 entries=22 op=nft_register_rule pid=4595 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:47:05.784241 containerd[1624]: time="2025-12-16T12:47:05.784208200Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f54fc4f9-v6lkx,Uid:50e78f30-18dc-4823-b78b-3500e19d4f6f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bee3cb231f617dd41c9372adde0e6bcf3aa778da9f8b1e02cc065cb2a2beb29c\"" Dec 16 12:47:05.780000 audit[4595]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7fff15ae1500 a2=0 a3=7fff15ae14ec items=0 ppid=2972 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.780000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:47:05.785000 audit[4595]: NETFILTER_CFG table=nat:120 family=2 entries=12 op=nft_register_rule pid=4595 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:47:05.785000 audit[4595]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff15ae1500 a2=0 a3=0 items=0 ppid=2972 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:05.785000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:47:05.865848 kubelet[2807]: I1216 12:47:05.864943 2807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:47:06.043713 systemd-networkd[1521]: calibcb1c768bc9: Gained IPv6LL Dec 16 12:47:06.044393 systemd-networkd[1521]: cali3093ba0761a: Gained IPv6LL Dec 16 12:47:06.212928 containerd[1624]: time="2025-12-16T12:47:06.212773126Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:06.214554 containerd[1624]: time="2025-12-16T12:47:06.214492284Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:47:06.214693 containerd[1624]: time="2025-12-16T12:47:06.214542308Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:06.214859 kubelet[2807]: E1216 12:47:06.214785 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:47:06.214859 
kubelet[2807]: E1216 12:47:06.214851 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:47:06.215241 kubelet[2807]: E1216 12:47:06.215145 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-h22qr_calico-system(eff3d741-729e-4ed9-a6a3-d314f99d7c29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:06.215241 kubelet[2807]: E1216 12:47:06.215194 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:47:06.215789 containerd[1624]: time="2025-12-16T12:47:06.215755580Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:47:06.417330 containerd[1624]: time="2025-12-16T12:47:06.417278122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df777597b-rfdvx,Uid:ae41bde4-2571-4fb7-adb8-a8808483af15,Namespace:calico-system,Attempt:0,}" Dec 16 12:47:06.424833 containerd[1624]: time="2025-12-16T12:47:06.424762308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-bqsxl,Uid:8265dde8-f635-41ae-8628-19a7296dfdf0,Namespace:kube-system,Attempt:0,}" Dec 16 12:47:06.557146 systemd-networkd[1521]: calibbaeb0aa8d0: Gained IPv6LL Dec 16 12:47:06.655505 systemd-networkd[1521]: cali90531e6cdcc: Link UP Dec 16 12:47:06.656762 systemd-networkd[1521]: cali90531e6cdcc: Gained carrier Dec 16 12:47:06.658354 containerd[1624]: time="2025-12-16T12:47:06.658113739Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:06.659540 containerd[1624]: time="2025-12-16T12:47:06.659499076Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:47:06.660743 containerd[1624]: time="2025-12-16T12:47:06.660020350Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:06.661788 kubelet[2807]: E1216 12:47:06.660200 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:06.661788 kubelet[2807]: E1216 12:47:06.660243 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:06.661788 kubelet[2807]: E1216 12:47:06.660327 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-9f54fc4f9-v6lkx_calico-apiserver(50e78f30-18dc-4823-b78b-3500e19d4f6f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:06.661788 kubelet[2807]: E1216 12:47:06.660356 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f" Dec 16 12:47:06.682357 kubelet[2807]: E1216 12:47:06.682149 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676" Dec 16 12:47:06.685415 kubelet[2807]: E1216 12:47:06.684933 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f" Dec 16 12:47:06.686711 kubelet[2807]: E1216 12:47:06.686647 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.518 
[INFO][4617] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.535 [INFO][4617] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0 calico-kube-controllers-6df777597b- calico-system ae41bde4-2571-4fb7-adb8-a8808483af15 802 0 2025-12-16 12:46:39 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6df777597b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-0-0-c-452f5360ea calico-kube-controllers-6df777597b-rfdvx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali90531e6cdcc [] [] }} ContainerID="1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" Namespace="calico-system" Pod="calico-kube-controllers-6df777597b-rfdvx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.535 [INFO][4617] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" Namespace="calico-system" Pod="calico-kube-controllers-6df777597b-rfdvx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.591 [INFO][4644] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" HandleID="k8s-pod-network.1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" Workload="ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.591 [INFO][4644] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" HandleID="k8s-pod-network.1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" Workload="ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4f30), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-c-452f5360ea", "pod":"calico-kube-controllers-6df777597b-rfdvx", "timestamp":"2025-12-16 12:47:06.591730976 +0000 UTC"}, Hostname:"ci-4547-0-0-c-452f5360ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.592 [INFO][4644] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.592 [INFO][4644] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
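The audit PROCTITLE fields scattered through these records are the invoking process's argv, hex-encoded with NUL separators. A standard-library Go decoder; the sample value is the iptables-restore proctitle copied from the NETFILTER_CFG records above:

    package main

    import (
    	"encoding/hex"
    	"fmt"
    	"strings"
    )

    // decodeProctitle turns an audit PROCTITLE hex string back into the argv it encodes.
    func decodeProctitle(h string) ([]string, error) {
    	raw, err := hex.DecodeString(h)
    	if err != nil {
    		return nil, err
    	}
    	// argv elements are separated by NUL bytes.
    	return strings.Split(strings.TrimRight(string(raw), "\x00"), "\x00"), nil
    }

    func main() {
    	// PROCTITLE value copied from the iptables-restore audit record above.
    	argv, err := decodeProctitle("69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273")
    	if err != nil {
    		panic(err)
    	}
    	fmt.Println(strings.Join(argv, " ")) // iptables-restore -w 5 --noflush --counters
    }

The runc PROCTITLE values decode the same way to "runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/...", truncated in this log.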
Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.592 [INFO][4644] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-c-452f5360ea' Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.608 [INFO][4644] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.614 [INFO][4644] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.621 [INFO][4644] ipam/ipam.go 511: Trying affinity for 192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.624 [INFO][4644] ipam/ipam.go 158: Attempting to load block cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.626 [INFO][4644] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.626 [INFO][4644] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.94.0/26 handle="k8s-pod-network.1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.628 [INFO][4644] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9 Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.635 [INFO][4644] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.94.0/26 handle="k8s-pod-network.1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.642 [INFO][4644] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.94.7/26] block=192.168.94.0/26 handle="k8s-pod-network.1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.642 [INFO][4644] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.94.7/26] handle="k8s-pod-network.1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.642 [INFO][4644] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
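The audit SYSCALL records in this log are flat key=value lists; arch=c000003e is AUDIT_ARCH_X86_64, and the syscall numbers that recur here are 321 (bpf, runc loading its BPF programs), 3 (close) and 46 (sendmsg, iptables-nft talking netlink). A small standard-library parser/annotator sketch for reading them:

    package main

    import (
    	"fmt"
    	"strings"
    )

    // Names for the x86_64 syscall numbers that appear in this log.
    var x8664Syscalls = map[string]string{
    	"3":   "close",
    	"46":  "sendmsg",
    	"321": "bpf",
    }

    // parseAuditFields splits an audit record body into its key=value fields.
    // Quoted and hex-encoded values are kept as-is; this is only a reading aid.
    func parseAuditFields(rec string) map[string]string {
    	fields := map[string]string{}
    	for _, tok := range strings.Fields(rec) {
    		if k, v, ok := strings.Cut(tok, "="); ok {
    			fields[k] = v
    		}
    	}
    	return fields
    }

    func main() {
    	rec := `arch=c000003e syscall=321 success=yes exit=21 ppid=4558 pid=4570 comm="runc" exe="/usr/bin/runc"`
    	f := parseAuditFields(rec)
    	fmt.Printf("%s(2) by %s pid=%s exit=%s\n",
    		x8664Syscalls[f["syscall"]], f["comm"], f["pid"], f["exit"])
    	// prints: bpf(2) by "runc" pid=4570 exit=21
    }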
Dec 16 12:47:06.690748 containerd[1624]: 2025-12-16 12:47:06.642 [INFO][4644] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.94.7/26] IPv6=[] ContainerID="1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" HandleID="k8s-pod-network.1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" Workload="ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0" Dec 16 12:47:06.692035 containerd[1624]: 2025-12-16 12:47:06.647 [INFO][4617] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" Namespace="calico-system" Pod="calico-kube-controllers-6df777597b-rfdvx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0", GenerateName:"calico-kube-controllers-6df777597b-", Namespace:"calico-system", SelfLink:"", UID:"ae41bde4-2571-4fb7-adb8-a8808483af15", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6df777597b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"", Pod:"calico-kube-controllers-6df777597b-rfdvx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.94.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali90531e6cdcc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:06.692035 containerd[1624]: 2025-12-16 12:47:06.648 [INFO][4617] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.94.7/32] ContainerID="1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" Namespace="calico-system" Pod="calico-kube-controllers-6df777597b-rfdvx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0" Dec 16 12:47:06.692035 containerd[1624]: 2025-12-16 12:47:06.648 [INFO][4617] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali90531e6cdcc ContainerID="1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" Namespace="calico-system" Pod="calico-kube-controllers-6df777597b-rfdvx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0" Dec 16 12:47:06.692035 containerd[1624]: 2025-12-16 12:47:06.658 [INFO][4617] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" Namespace="calico-system" Pod="calico-kube-controllers-6df777597b-rfdvx" 
WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0" Dec 16 12:47:06.692035 containerd[1624]: 2025-12-16 12:47:06.658 [INFO][4617] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" Namespace="calico-system" Pod="calico-kube-controllers-6df777597b-rfdvx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0", GenerateName:"calico-kube-controllers-6df777597b-", Namespace:"calico-system", SelfLink:"", UID:"ae41bde4-2571-4fb7-adb8-a8808483af15", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6df777597b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9", Pod:"calico-kube-controllers-6df777597b-rfdvx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.94.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali90531e6cdcc", MAC:"8a:ec:1f:1d:ef:74", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:06.692035 containerd[1624]: 2025-12-16 12:47:06.684 [INFO][4617] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" Namespace="calico-system" Pod="calico-kube-controllers-6df777597b-rfdvx" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-calico--kube--controllers--6df777597b--rfdvx-eth0" Dec 16 12:47:06.735319 containerd[1624]: time="2025-12-16T12:47:06.735195540Z" level=info msg="connecting to shim 1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9" address="unix:///run/containerd/s/f2cb66cc2dd005c45a1d26ce011e42337629bb2bd6e1389bf1cf7eb2326d19c3" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:47:06.789027 systemd[1]: Started cri-containerd-1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9.scope - libcontainer container 1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9. 
Dec 16 12:47:06.803809 systemd-networkd[1521]: cali43903d45921: Link UP Dec 16 12:47:06.804269 systemd-networkd[1521]: cali43903d45921: Gained carrier Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.508 [INFO][4626] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.530 [INFO][4626] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0 coredns-66bc5c9577- kube-system 8265dde8-f635-41ae-8628-19a7296dfdf0 808 0 2025-12-16 12:46:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-c-452f5360ea coredns-66bc5c9577-bqsxl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali43903d45921 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" Namespace="kube-system" Pod="coredns-66bc5c9577-bqsxl" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.530 [INFO][4626] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" Namespace="kube-system" Pod="coredns-66bc5c9577-bqsxl" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.616 [INFO][4642] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" HandleID="k8s-pod-network.f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" Workload="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.616 [INFO][4642] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" HandleID="k8s-pod-network.f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" Workload="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0005c3880), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-c-452f5360ea", "pod":"coredns-66bc5c9577-bqsxl", "timestamp":"2025-12-16 12:47:06.61611886 +0000 UTC"}, Hostname:"ci-4547-0-0-c-452f5360ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.616 [INFO][4642] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.642 [INFO][4642] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
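The coredns endpoint above lists named ports dns/dns-tcp 53, metrics 9153, liveness-probe 8080 and readiness-probe 8181; the later Go-struct dumps print the same Port values in hex (0x35, 0x23c1, 0x1f90, 0x1ff5). A tiny conversion check confirming they match:

    package main

    import "fmt"

    func main() {
    	// Hex Port values from the WorkloadEndpointPort dumps vs. the decimal ports above.
    	ports := map[string]uint16{
    		"dns / dns-tcp":   0x35,   // 53
    		"metrics":         0x23c1, // 9153
    		"liveness-probe":  0x1f90, // 8080
    		"readiness-probe": 0x1ff5, // 8181
    	}
    	for name, p := range ports {
    		fmt.Printf("%-16s %d\n", name, p)
    	}
    }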
Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.642 [INFO][4642] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-c-452f5360ea' Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.708 [INFO][4642] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.740 [INFO][4642] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.752 [INFO][4642] ipam/ipam.go 511: Trying affinity for 192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.757 [INFO][4642] ipam/ipam.go 158: Attempting to load block cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.765 [INFO][4642] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.94.0/26 host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.765 [INFO][4642] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.94.0/26 handle="k8s-pod-network.f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.767 [INFO][4642] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172 Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.774 [INFO][4642] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.94.0/26 handle="k8s-pod-network.f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.787 [INFO][4642] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.94.8/26] block=192.168.94.0/26 handle="k8s-pod-network.f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.787 [INFO][4642] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.94.8/26] handle="k8s-pod-network.f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" host="ci-4547-0-0-c-452f5360ea" Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.788 [INFO][4642] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
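The kubelet "Error syncing pod ... ErrImagePull / ImagePullBackOff" records earlier in this log surface in the Kubernetes API as a Waiting container state. A client-go sketch that reads that state for the csi-node-driver pod named above; the kubeconfig path is an assumption and not taken from this log:

    package main

    import (
    	"context"
    	"fmt"
    	"log"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// Assumed kubeconfig location; adjust for the cluster at hand.
    	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
    	if err != nil {
    		log.Fatal(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		log.Fatal(err)
    	}

    	// Pod and namespace taken from the log above.
    	pod, err := cs.CoreV1().Pods("calico-system").Get(context.Background(), "csi-node-driver-h22qr", metav1.GetOptions{})
    	if err != nil {
    		log.Fatal(err)
    	}
    	for _, st := range pod.Status.ContainerStatuses {
    		if st.State.Waiting != nil {
    			// Expect reasons like ErrImagePull / ImagePullBackOff, matching the kubelet lines above.
    			fmt.Printf("%s: %s (%s)\n", st.Name, st.State.Waiting.Reason, st.State.Waiting.Message)
    		}
    	}
    }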
Dec 16 12:47:06.823265 containerd[1624]: 2025-12-16 12:47:06.788 [INFO][4642] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.94.8/26] IPv6=[] ContainerID="f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" HandleID="k8s-pod-network.f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" Workload="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0" Dec 16 12:47:06.823804 containerd[1624]: 2025-12-16 12:47:06.792 [INFO][4626] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" Namespace="kube-system" Pod="coredns-66bc5c9577-bqsxl" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"8265dde8-f635-41ae-8628-19a7296dfdf0", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"", Pod:"coredns-66bc5c9577-bqsxl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali43903d45921", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:06.823804 containerd[1624]: 2025-12-16 12:47:06.792 [INFO][4626] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.94.8/32] ContainerID="f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" Namespace="kube-system" Pod="coredns-66bc5c9577-bqsxl" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0" Dec 16 12:47:06.823804 containerd[1624]: 2025-12-16 12:47:06.792 [INFO][4626] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali43903d45921 ContainerID="f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" Namespace="kube-system" Pod="coredns-66bc5c9577-bqsxl" 
WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0" Dec 16 12:47:06.823804 containerd[1624]: 2025-12-16 12:47:06.804 [INFO][4626] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" Namespace="kube-system" Pod="coredns-66bc5c9577-bqsxl" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0" Dec 16 12:47:06.823804 containerd[1624]: 2025-12-16 12:47:06.806 [INFO][4626] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" Namespace="kube-system" Pod="coredns-66bc5c9577-bqsxl" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"8265dde8-f635-41ae-8628-19a7296dfdf0", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 46, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-c-452f5360ea", ContainerID:"f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172", Pod:"coredns-66bc5c9577-bqsxl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali43903d45921", MAC:"e2:18:fe:8f:7f:1d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:47:06.823970 containerd[1624]: 2025-12-16 12:47:06.819 [INFO][4626] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" Namespace="kube-system" Pod="coredns-66bc5c9577-bqsxl" WorkloadEndpoint="ci--4547--0--0--c--452f5360ea-k8s-coredns--66bc5c9577--bqsxl-eth0" Dec 16 12:47:06.845000 audit[4707]: NETFILTER_CFG table=filter:121 family=2 entries=18 op=nft_register_rule pid=4707 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 
12:47:06.845000 audit[4707]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffeb9a16660 a2=0 a3=7ffeb9a1664c items=0 ppid=2972 pid=4707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.845000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:47:06.850827 containerd[1624]: time="2025-12-16T12:47:06.850757732Z" level=info msg="connecting to shim f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172" address="unix:///run/containerd/s/b2a63d519d27db674560c768186879b60147852280b679e6725bb6700e5a62dc" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:47:06.851000 audit: BPF prog-id=210 op=LOAD Dec 16 12:47:06.851000 audit: BPF prog-id=211 op=LOAD Dec 16 12:47:06.851000 audit[4682]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4670 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134393061373466396531383231306639626462366133316666643637 Dec 16 12:47:06.851000 audit: BPF prog-id=211 op=UNLOAD Dec 16 12:47:06.851000 audit[4682]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4670 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134393061373466396531383231306639626462366133316666643637 Dec 16 12:47:06.852000 audit: BPF prog-id=212 op=LOAD Dec 16 12:47:06.852000 audit[4682]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4670 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134393061373466396531383231306639626462366133316666643637 Dec 16 12:47:06.852000 audit: BPF prog-id=213 op=LOAD Dec 16 12:47:06.852000 audit[4682]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4670 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.852000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134393061373466396531383231306639626462366133316666643637 Dec 16 12:47:06.852000 audit: BPF prog-id=213 op=UNLOAD Dec 16 12:47:06.852000 audit[4682]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4670 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134393061373466396531383231306639626462366133316666643637 Dec 16 12:47:06.852000 audit: BPF prog-id=212 op=UNLOAD Dec 16 12:47:06.852000 audit[4682]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4670 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134393061373466396531383231306639626462366133316666643637 Dec 16 12:47:06.852000 audit: BPF prog-id=214 op=LOAD Dec 16 12:47:06.852000 audit[4682]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4670 pid=4682 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134393061373466396531383231306639626462366133316666643637 Dec 16 12:47:06.860000 audit[4707]: NETFILTER_CFG table=nat:122 family=2 entries=40 op=nft_register_chain pid=4707 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:47:06.860000 audit[4707]: SYSCALL arch=c000003e syscall=46 success=yes exit=17004 a0=3 a1=7ffeb9a16660 a2=0 a3=7ffeb9a1664c items=0 ppid=2972 pid=4707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.860000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:47:06.875905 systemd-networkd[1521]: cali1854643f905: Gained IPv6LL Dec 16 12:47:06.884778 systemd[1]: Started cri-containerd-f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172.scope - libcontainer container f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172. 
Dec 16 12:47:06.896000 audit: BPF prog-id=215 op=LOAD Dec 16 12:47:06.897000 audit: BPF prog-id=216 op=LOAD Dec 16 12:47:06.897000 audit[4731]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4720 pid=4731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634663166633366623537383334623534396530386535326231666264 Dec 16 12:47:06.897000 audit: BPF prog-id=216 op=UNLOAD Dec 16 12:47:06.897000 audit[4731]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4720 pid=4731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634663166633366623537383334623534396530386535326231666264 Dec 16 12:47:06.897000 audit: BPF prog-id=217 op=LOAD Dec 16 12:47:06.897000 audit[4731]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4720 pid=4731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634663166633366623537383334623534396530386535326231666264 Dec 16 12:47:06.897000 audit: BPF prog-id=218 op=LOAD Dec 16 12:47:06.897000 audit[4731]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=4720 pid=4731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634663166633366623537383334623534396530386535326231666264 Dec 16 12:47:06.897000 audit: BPF prog-id=218 op=UNLOAD Dec 16 12:47:06.897000 audit[4731]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4720 pid=4731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634663166633366623537383334623534396530386535326231666264 Dec 16 12:47:06.897000 audit: BPF prog-id=217 op=UNLOAD Dec 16 12:47:06.897000 audit[4731]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4720 pid=4731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634663166633366623537383334623534396530386535326231666264 Dec 16 12:47:06.897000 audit: BPF prog-id=219 op=LOAD Dec 16 12:47:06.897000 audit[4731]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=4720 pid=4731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:06.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634663166633366623537383334623534396530386535326231666264 Dec 16 12:47:06.931425 containerd[1624]: time="2025-12-16T12:47:06.931324305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-bqsxl,Uid:8265dde8-f635-41ae-8628-19a7296dfdf0,Namespace:kube-system,Attempt:0,} returns sandbox id \"f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172\"" Dec 16 12:47:06.939012 containerd[1624]: time="2025-12-16T12:47:06.938939618Z" level=info msg="CreateContainer within sandbox \"f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:47:06.955717 containerd[1624]: time="2025-12-16T12:47:06.955675068Z" level=info msg="Container d98e55198bd4544cd11a5bc2354eb7a326498809df81bbea1125e88d490ea904: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:47:06.963978 containerd[1624]: time="2025-12-16T12:47:06.963888121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6df777597b-rfdvx,Uid:ae41bde4-2571-4fb7-adb8-a8808483af15,Namespace:calico-system,Attempt:0,} returns sandbox id \"1490a74f9e18210f9bdb6a31ffd676ac3598e70a663590c7256e447cb59394e9\"" Dec 16 12:47:06.966116 containerd[1624]: time="2025-12-16T12:47:06.966060183Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:47:06.966425 containerd[1624]: time="2025-12-16T12:47:06.966409183Z" level=info msg="CreateContainer within sandbox \"f4f1fc3fb57834b549e08e52b1fbd28aa8ae0d5fdd963434c9a47e2c48d3d172\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d98e55198bd4544cd11a5bc2354eb7a326498809df81bbea1125e88d490ea904\"" Dec 16 12:47:06.968065 containerd[1624]: time="2025-12-16T12:47:06.967146414Z" level=info msg="StartContainer for \"d98e55198bd4544cd11a5bc2354eb7a326498809df81bbea1125e88d490ea904\"" Dec 16 12:47:06.968065 containerd[1624]: time="2025-12-16T12:47:06.967829485Z" level=info msg="connecting to shim d98e55198bd4544cd11a5bc2354eb7a326498809df81bbea1125e88d490ea904" address="unix:///run/containerd/s/b2a63d519d27db674560c768186879b60147852280b679e6725bb6700e5a62dc" protocol=ttrpc version=3 Dec 16 12:47:06.989716 systemd[1]: Started cri-containerd-d98e55198bd4544cd11a5bc2354eb7a326498809df81bbea1125e88d490ea904.scope - libcontainer container 
d98e55198bd4544cd11a5bc2354eb7a326498809df81bbea1125e88d490ea904. Dec 16 12:47:07.010000 audit: BPF prog-id=220 op=LOAD Dec 16 12:47:07.011000 audit: BPF prog-id=221 op=LOAD Dec 16 12:47:07.011000 audit[4769]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4720 pid=4769 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439386535353139386264343534346364313161356263323335346562 Dec 16 12:47:07.012000 audit: BPF prog-id=221 op=UNLOAD Dec 16 12:47:07.012000 audit[4769]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4720 pid=4769 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.012000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439386535353139386264343534346364313161356263323335346562 Dec 16 12:47:07.013000 audit: BPF prog-id=222 op=LOAD Dec 16 12:47:07.013000 audit[4769]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4720 pid=4769 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.013000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439386535353139386264343534346364313161356263323335346562 Dec 16 12:47:07.013000 audit: BPF prog-id=223 op=LOAD Dec 16 12:47:07.013000 audit[4769]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4720 pid=4769 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.013000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439386535353139386264343534346364313161356263323335346562 Dec 16 12:47:07.013000 audit: BPF prog-id=223 op=UNLOAD Dec 16 12:47:07.013000 audit[4769]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4720 pid=4769 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.013000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439386535353139386264343534346364313161356263323335346562 Dec 16 12:47:07.013000 audit: BPF 
prog-id=222 op=UNLOAD Dec 16 12:47:07.013000 audit[4769]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4720 pid=4769 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.013000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439386535353139386264343534346364313161356263323335346562 Dec 16 12:47:07.013000 audit: BPF prog-id=224 op=LOAD Dec 16 12:47:07.013000 audit[4769]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4720 pid=4769 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.013000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439386535353139386264343534346364313161356263323335346562 Dec 16 12:47:07.096471 containerd[1624]: time="2025-12-16T12:47:07.096366996Z" level=info msg="StartContainer for \"d98e55198bd4544cd11a5bc2354eb7a326498809df81bbea1125e88d490ea904\" returns successfully" Dec 16 12:47:07.139000 audit: BPF prog-id=225 op=LOAD Dec 16 12:47:07.139000 audit[4830]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff7ed9f7a0 a2=98 a3=1fffffffffffffff items=0 ppid=4742 pid=4830 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.139000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:47:07.139000 audit: BPF prog-id=225 op=UNLOAD Dec 16 12:47:07.139000 audit[4830]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff7ed9f770 a3=0 items=0 ppid=4742 pid=4830 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.139000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:47:07.139000 audit: BPF prog-id=226 op=LOAD Dec 16 12:47:07.139000 audit[4830]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff7ed9f680 a2=94 a3=3 items=0 ppid=4742 pid=4830 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.139000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:47:07.140000 audit: BPF prog-id=226 op=UNLOAD Dec 16 12:47:07.140000 audit[4830]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff7ed9f680 a2=94 a3=3 items=0 ppid=4742 pid=4830 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.140000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:47:07.140000 audit: BPF prog-id=227 op=LOAD Dec 16 12:47:07.140000 audit[4830]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff7ed9f6c0 a2=94 a3=7fff7ed9f8a0 items=0 ppid=4742 pid=4830 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.140000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:47:07.140000 audit: BPF prog-id=227 op=UNLOAD Dec 16 12:47:07.140000 audit[4830]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff7ed9f6c0 a2=94 a3=7fff7ed9f8a0 items=0 ppid=4742 pid=4830 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.140000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:47:07.142000 audit: BPF prog-id=228 op=LOAD Dec 16 12:47:07.142000 audit[4831]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd5e53da10 a2=98 a3=3 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.142000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.143000 audit: BPF prog-id=228 op=UNLOAD Dec 16 12:47:07.143000 audit[4831]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd5e53d9e0 a3=0 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.143000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.143000 audit: BPF prog-id=229 op=LOAD Dec 16 12:47:07.143000 audit[4831]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd5e53d800 a2=94 a3=54428f items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.143000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.143000 audit: BPF prog-id=229 op=UNLOAD Dec 16 12:47:07.143000 audit[4831]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd5e53d800 a2=94 a3=54428f items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.143000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.143000 audit: BPF prog-id=230 op=LOAD Dec 16 12:47:07.143000 audit[4831]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd5e53d830 a2=94 a3=2 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.143000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.143000 audit: BPF prog-id=230 op=UNLOAD Dec 16 12:47:07.143000 audit[4831]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd5e53d830 a2=0 a3=2 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.143000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.374000 audit: BPF prog-id=231 op=LOAD Dec 16 12:47:07.374000 audit[4831]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd5e53d6f0 a2=94 a3=1 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.374000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.374000 audit: BPF prog-id=231 op=UNLOAD Dec 16 12:47:07.374000 audit[4831]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd5e53d6f0 a2=94 a3=1 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.374000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.384000 audit: BPF prog-id=232 op=LOAD Dec 16 12:47:07.384000 audit[4831]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd5e53d6e0 a2=94 a3=4 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.384000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.384000 audit: BPF prog-id=232 op=UNLOAD Dec 16 12:47:07.384000 audit[4831]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd5e53d6e0 a2=0 a3=4 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.384000 audit: 
PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.385000 audit: BPF prog-id=233 op=LOAD Dec 16 12:47:07.385000 audit[4831]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd5e53d540 a2=94 a3=5 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.385000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.385000 audit: BPF prog-id=233 op=UNLOAD Dec 16 12:47:07.385000 audit[4831]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd5e53d540 a2=0 a3=5 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.385000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.385000 audit: BPF prog-id=234 op=LOAD Dec 16 12:47:07.385000 audit[4831]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd5e53d760 a2=94 a3=6 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.385000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.385000 audit: BPF prog-id=234 op=UNLOAD Dec 16 12:47:07.385000 audit[4831]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd5e53d760 a2=0 a3=6 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.385000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.386000 audit: BPF prog-id=235 op=LOAD Dec 16 12:47:07.386000 audit[4831]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd5e53cf10 a2=94 a3=88 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.386000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.386000 audit: BPF prog-id=236 op=LOAD Dec 16 12:47:07.386000 audit[4831]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffd5e53cd90 a2=94 a3=2 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.386000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.386000 audit: BPF prog-id=236 op=UNLOAD Dec 16 12:47:07.386000 audit[4831]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffd5e53cdc0 a2=0 a3=7ffd5e53cec0 items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.386000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.387000 audit: BPF prog-id=235 op=UNLOAD Dec 16 12:47:07.387000 audit[4831]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=a838d10 a2=0 a3=b75f4633d9965c4d items=0 ppid=4742 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.387000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:47:07.393874 containerd[1624]: time="2025-12-16T12:47:07.393836853Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:07.395486 containerd[1624]: time="2025-12-16T12:47:07.395405715Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:47:07.395687 containerd[1624]: time="2025-12-16T12:47:07.395429680Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:07.396112 kubelet[2807]: E1216 12:47:07.396052 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:47:07.396112 kubelet[2807]: E1216 12:47:07.396109 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:47:07.396563 kubelet[2807]: E1216 12:47:07.396232 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6df777597b-rfdvx_calico-system(ae41bde4-2571-4fb7-adb8-a8808483af15): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:07.396653 kubelet[2807]: E1216 12:47:07.396516 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15" Dec 16 12:47:07.414000 audit: BPF prog-id=237 op=LOAD Dec 16 12:47:07.414000 audit[4858]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffca7a7f940 a2=98 a3=1999999999999999 items=0 ppid=4742 pid=4858 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.414000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:47:07.414000 audit: BPF prog-id=237 op=UNLOAD Dec 16 12:47:07.414000 audit[4858]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffca7a7f910 a3=0 items=0 ppid=4742 pid=4858 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.414000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:47:07.414000 audit: BPF prog-id=238 op=LOAD Dec 16 12:47:07.414000 audit[4858]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffca7a7f820 a2=94 a3=ffff items=0 ppid=4742 pid=4858 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.414000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:47:07.414000 audit: BPF prog-id=238 op=UNLOAD Dec 16 12:47:07.414000 audit[4858]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffca7a7f820 a2=94 a3=ffff items=0 ppid=4742 pid=4858 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.414000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:47:07.414000 audit: BPF prog-id=239 op=LOAD Dec 16 12:47:07.414000 audit[4858]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffca7a7f860 a2=94 a3=7ffca7a7fa40 items=0 ppid=4742 pid=4858 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.414000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:47:07.415000 audit: BPF prog-id=239 op=UNLOAD Dec 16 12:47:07.415000 audit[4858]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffca7a7f860 a2=94 a3=7ffca7a7fa40 items=0 ppid=4742 pid=4858 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.415000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:47:07.516012 systemd-networkd[1521]: vxlan.calico: Link UP Dec 16 12:47:07.516020 systemd-networkd[1521]: vxlan.calico: Gained carrier Dec 16 12:47:07.574000 audit: BPF prog-id=240 op=LOAD Dec 16 12:47:07.574000 audit[4885]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc0646fa80 a2=98 a3=0 items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.574000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.575000 audit: BPF prog-id=240 op=UNLOAD Dec 16 12:47:07.575000 audit[4885]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffc0646fa50 a3=0 items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.575000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.575000 audit: BPF prog-id=241 op=LOAD Dec 16 12:47:07.575000 audit[4885]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc0646f890 a2=94 a3=54428f items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.575000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.575000 audit: BPF prog-id=241 op=UNLOAD Dec 16 12:47:07.575000 audit[4885]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc0646f890 a2=94 a3=54428f items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.575000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.575000 audit: BPF prog-id=242 op=LOAD Dec 16 12:47:07.575000 audit[4885]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc0646f8c0 a2=94 a3=2 items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.575000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.575000 audit: BPF prog-id=242 op=UNLOAD Dec 16 12:47:07.575000 audit[4885]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc0646f8c0 a2=0 a3=2 items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.575000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.575000 audit: BPF prog-id=243 op=LOAD Dec 16 12:47:07.575000 audit[4885]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc0646f670 a2=94 a3=4 items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.575000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.575000 audit: BPF prog-id=243 op=UNLOAD Dec 16 12:47:07.575000 audit[4885]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffc0646f670 a2=94 a3=4 items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.575000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.575000 audit: BPF prog-id=244 op=LOAD Dec 16 12:47:07.575000 audit[4885]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc0646f770 a2=94 a3=7ffc0646f8f0 items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.575000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.575000 audit: BPF prog-id=244 op=UNLOAD Dec 16 12:47:07.575000 audit[4885]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffc0646f770 a2=0 a3=7ffc0646f8f0 items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.575000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.576000 audit: BPF prog-id=245 op=LOAD Dec 16 12:47:07.576000 audit[4885]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc0646eea0 a2=94 a3=2 items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.576000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.576000 audit: BPF prog-id=245 op=UNLOAD Dec 16 12:47:07.576000 audit[4885]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffc0646eea0 a2=0 a3=2 items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.576000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.576000 audit: BPF prog-id=246 op=LOAD Dec 16 12:47:07.576000 audit[4885]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc0646efa0 a2=94 a3=30 items=0 ppid=4742 pid=4885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.576000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:47:07.581000 audit: BPF prog-id=247 op=LOAD Dec 16 12:47:07.581000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdea115820 a2=98 a3=0 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.581000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.581000 audit: BPF prog-id=247 op=UNLOAD Dec 16 12:47:07.581000 audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffdea1157f0 a3=0 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.581000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.582000 audit: BPF prog-id=248 op=LOAD Dec 16 12:47:07.582000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffdea115610 a2=94 a3=54428f items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.582000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.582000 audit: BPF prog-id=248 op=UNLOAD Dec 16 12:47:07.582000 audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffdea115610 a2=94 a3=54428f items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.582000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.582000 audit: BPF prog-id=249 op=LOAD Dec 16 12:47:07.582000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffdea115640 a2=94 a3=2 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.582000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.582000 audit: BPF prog-id=249 op=UNLOAD Dec 16 12:47:07.582000 audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffdea115640 a2=0 a3=2 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.582000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.693097 kubelet[2807]: E1216 12:47:07.692950 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f" Dec 16 12:47:07.695082 kubelet[2807]: E1216 12:47:07.695044 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15" Dec 16 12:47:07.724670 kubelet[2807]: I1216 12:47:07.724583 2807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-bqsxl" podStartSLOduration=44.724567783 podStartE2EDuration="44.724567783s" podCreationTimestamp="2025-12-16 12:46:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:47:07.704118475 +0000 UTC m=+51.454377468" watchObservedRunningTime="2025-12-16 12:47:07.724567783 +0000 UTC m=+51.474826775" Dec 16 12:47:07.808000 audit: BPF prog-id=250 op=LOAD Dec 16 12:47:07.808000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffdea115500 a2=94 a3=1 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.808000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.808000 audit: BPF prog-id=250 op=UNLOAD Dec 16 12:47:07.808000 audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffdea115500 a2=94 a3=1 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.808000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.818000 audit: BPF prog-id=251 op=LOAD Dec 16 12:47:07.818000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffdea1154f0 a2=94 a3=4 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.818000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.818000 audit: BPF prog-id=251 op=UNLOAD Dec 16 12:47:07.818000 audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffdea1154f0 a2=0 a3=4 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.818000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.818000 audit: BPF prog-id=252 op=LOAD Dec 16 12:47:07.818000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffdea115350 a2=94 a3=5 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.818000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.818000 audit: BPF prog-id=252 op=UNLOAD Dec 16 12:47:07.818000 audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffdea115350 a2=0 a3=5 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.818000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.818000 audit: BPF prog-id=253 op=LOAD Dec 16 12:47:07.818000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffdea115570 a2=94 a3=6 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.818000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.818000 audit: BPF prog-id=253 op=UNLOAD Dec 16 12:47:07.818000 audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffdea115570 a2=0 a3=6 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.818000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.819000 audit: BPF prog-id=254 op=LOAD Dec 16 12:47:07.819000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffdea114d20 a2=94 a3=88 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.819000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.819000 audit: BPF prog-id=255 op=LOAD Dec 16 12:47:07.819000 audit[4887]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffdea114ba0 a2=94 a3=2 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.819000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.819000 audit: BPF prog-id=255 op=UNLOAD Dec 16 12:47:07.819000 audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffdea114bd0 a2=0 a3=7ffdea114cd0 items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.819000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.819000 audit: BPF prog-id=254 op=UNLOAD Dec 16 12:47:07.819000 
audit[4887]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=252c1d10 a2=0 a3=56777e69463892eb items=0 ppid=4742 pid=4887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.819000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:47:07.826000 audit: BPF prog-id=246 op=UNLOAD Dec 16 12:47:07.826000 audit[4742]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c00004d7c0 a2=0 a3=0 items=0 ppid=3987 pid=4742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.826000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Dec 16 12:47:07.900768 systemd-networkd[1521]: cali90531e6cdcc: Gained IPv6LL Dec 16 12:47:07.899000 audit[4913]: NETFILTER_CFG table=filter:123 family=2 entries=14 op=nft_register_rule pid=4913 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:47:07.899000 audit[4913]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffcac4fb590 a2=0 a3=7ffcac4fb57c items=0 ppid=2972 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.899000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:47:07.914000 audit[4913]: NETFILTER_CFG table=nat:124 family=2 entries=56 op=nft_register_chain pid=4913 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:47:07.914000 audit[4913]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffcac4fb590 a2=0 a3=7ffcac4fb57c items=0 ppid=2972 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.914000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:47:07.932000 audit[4921]: NETFILTER_CFG table=nat:125 family=2 entries=15 op=nft_register_chain pid=4921 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:47:07.932000 audit[4921]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffc714d8b90 a2=0 a3=7ffc714d8b7c items=0 ppid=4742 pid=4921 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.932000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:47:07.935000 audit[4922]: NETFILTER_CFG table=mangle:126 family=2 entries=16 op=nft_register_chain pid=4922 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:47:07.935000 audit[4922]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffd2339ce30 a2=0 a3=7ffd2339ce1c items=0 ppid=4742 pid=4922 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.935000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:47:07.940000 audit[4919]: NETFILTER_CFG table=raw:127 family=2 entries=21 op=nft_register_chain pid=4919 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:47:07.940000 audit[4919]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7fff424d4a30 a2=0 a3=7fff424d4a1c items=0 ppid=4742 pid=4919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.940000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:47:07.951000 audit[4926]: NETFILTER_CFG table=filter:128 family=2 entries=327 op=nft_register_chain pid=4926 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:47:07.951000 audit[4926]: SYSCALL arch=c000003e syscall=46 success=yes exit=193468 a0=3 a1=7fff964c3f30 a2=0 a3=7fff964c3f1c items=0 ppid=4742 pid=4926 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:07.951000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:47:08.411780 systemd-networkd[1521]: cali43903d45921: Gained IPv6LL Dec 16 12:47:08.667773 systemd-networkd[1521]: vxlan.calico: Gained IPv6LL Dec 16 12:47:08.692158 kubelet[2807]: E1216 12:47:08.692127 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15" Dec 16 12:47:17.414690 containerd[1624]: time="2025-12-16T12:47:17.414615169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:47:17.841316 containerd[1624]: time="2025-12-16T12:47:17.841266229Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:17.843246 containerd[1624]: time="2025-12-16T12:47:17.843189637Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:47:17.843724 containerd[1624]: time="2025-12-16T12:47:17.843268727Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:17.843768 kubelet[2807]: E1216 12:47:17.843421 2807 log.go:32] "PullImage from image service failed" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:47:17.843768 kubelet[2807]: E1216 12:47:17.843468 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:47:17.843768 kubelet[2807]: E1216 12:47:17.843579 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-78c5bf4bd4-c4c2s_calico-system(373b77f0-3f56-4579-96a0-a033d280b187): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:17.845080 containerd[1624]: time="2025-12-16T12:47:17.845054715Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:47:18.285884 containerd[1624]: time="2025-12-16T12:47:18.285463483Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:18.287647 containerd[1624]: time="2025-12-16T12:47:18.287596527Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:47:18.287746 containerd[1624]: time="2025-12-16T12:47:18.287707967Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:18.288124 kubelet[2807]: E1216 12:47:18.288060 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:47:18.288260 kubelet[2807]: E1216 12:47:18.288132 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:47:18.288260 kubelet[2807]: E1216 12:47:18.288229 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-78c5bf4bd4-c4c2s_calico-system(373b77f0-3f56-4579-96a0-a033d280b187): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:18.288349 kubelet[2807]: E1216 12:47:18.288285 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to 
\"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187" Dec 16 12:47:18.417734 containerd[1624]: time="2025-12-16T12:47:18.416488452Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:47:18.844481 containerd[1624]: time="2025-12-16T12:47:18.844415375Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:18.845960 containerd[1624]: time="2025-12-16T12:47:18.845915214Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:47:18.846135 containerd[1624]: time="2025-12-16T12:47:18.845932316Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:18.846237 kubelet[2807]: E1216 12:47:18.846147 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:18.846237 kubelet[2807]: E1216 12:47:18.846195 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:18.847856 kubelet[2807]: E1216 12:47:18.846372 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-9f54fc4f9-v6lkx_calico-apiserver(50e78f30-18dc-4823-b78b-3500e19d4f6f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:18.847856 kubelet[2807]: E1216 12:47:18.846407 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f" Dec 16 12:47:18.848386 containerd[1624]: time="2025-12-16T12:47:18.848124250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:47:19.288011 containerd[1624]: time="2025-12-16T12:47:19.287774797Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:19.289910 containerd[1624]: time="2025-12-16T12:47:19.289795710Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 
12:47:19.289910 containerd[1624]: time="2025-12-16T12:47:19.289858668Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:19.290524 kubelet[2807]: E1216 12:47:19.290089 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:47:19.290524 kubelet[2807]: E1216 12:47:19.290146 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:47:19.290524 kubelet[2807]: E1216 12:47:19.290246 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-h22qr_calico-system(eff3d741-729e-4ed9-a6a3-d314f99d7c29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:19.292783 containerd[1624]: time="2025-12-16T12:47:19.292749551Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:47:19.732041 containerd[1624]: time="2025-12-16T12:47:19.731934246Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:19.733939 containerd[1624]: time="2025-12-16T12:47:19.733683075Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:47:19.733939 containerd[1624]: time="2025-12-16T12:47:19.733729302Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:19.734176 kubelet[2807]: E1216 12:47:19.734083 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:47:19.734176 kubelet[2807]: E1216 12:47:19.734142 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:47:19.734405 kubelet[2807]: E1216 12:47:19.734364 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-h22qr_calico-system(eff3d741-729e-4ed9-a6a3-d314f99d7c29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:19.734620 kubelet[2807]: E1216 
12:47:19.734436 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:47:19.735166 containerd[1624]: time="2025-12-16T12:47:19.734799571Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:47:20.193818 containerd[1624]: time="2025-12-16T12:47:20.193712992Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:20.195567 containerd[1624]: time="2025-12-16T12:47:20.195454225Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:47:20.195567 containerd[1624]: time="2025-12-16T12:47:20.195495363Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:20.195999 kubelet[2807]: E1216 12:47:20.195912 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:20.195999 kubelet[2807]: E1216 12:47:20.195980 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:20.196474 kubelet[2807]: E1216 12:47:20.196091 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-9f54fc4f9-7wdcc_calico-apiserver(1081c7e3-0645-4858-ac62-90f635bb19a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:20.196474 kubelet[2807]: E1216 12:47:20.196135 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9" Dec 16 12:47:20.415022 containerd[1624]: time="2025-12-16T12:47:20.413522155Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:47:20.837249 containerd[1624]: time="2025-12-16T12:47:20.837141530Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Dec 16 12:47:20.838895 containerd[1624]: time="2025-12-16T12:47:20.838808644Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:47:20.838992 containerd[1624]: time="2025-12-16T12:47:20.838937667Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:20.848297 kubelet[2807]: E1216 12:47:20.848216 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:47:20.848297 kubelet[2807]: E1216 12:47:20.848272 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:47:20.848598 kubelet[2807]: E1216 12:47:20.848348 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sdw6q_calico-system(db9e60f0-9cc3-4000-b32c-9ba313d4b676): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:20.848598 kubelet[2807]: E1216 12:47:20.848380 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676" Dec 16 12:47:23.413081 containerd[1624]: time="2025-12-16T12:47:23.413049452Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:47:24.077038 containerd[1624]: time="2025-12-16T12:47:24.076927000Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:24.078927 containerd[1624]: time="2025-12-16T12:47:24.078823806Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:47:24.079212 containerd[1624]: time="2025-12-16T12:47:24.079101610Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:24.079327 kubelet[2807]: E1216 12:47:24.079137 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:47:24.079327 kubelet[2807]: E1216 12:47:24.079197 2807 kuberuntime_image.go:43] 
"Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:47:24.079327 kubelet[2807]: E1216 12:47:24.079294 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6df777597b-rfdvx_calico-system(ae41bde4-2571-4fb7-adb8-a8808483af15): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:24.081064 kubelet[2807]: E1216 12:47:24.079339 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15" Dec 16 12:47:31.414385 kubelet[2807]: E1216 12:47:31.414299 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f" Dec 16 12:47:31.415416 kubelet[2807]: E1216 12:47:31.415236 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676" Dec 16 12:47:32.413998 kubelet[2807]: E1216 12:47:32.413747 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9" Dec 16 12:47:32.416629 kubelet[2807]: E1216 12:47:32.416347 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for 
\"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187" Dec 16 12:47:33.414219 kubelet[2807]: E1216 12:47:33.414023 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:47:36.413691 kubelet[2807]: E1216 12:47:36.413574 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15" Dec 16 12:47:44.416954 containerd[1624]: time="2025-12-16T12:47:44.416886532Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:47:44.860627 containerd[1624]: time="2025-12-16T12:47:44.860502815Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:44.862195 containerd[1624]: time="2025-12-16T12:47:44.862092729Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:47:44.862414 containerd[1624]: time="2025-12-16T12:47:44.862126975Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:44.862645 kubelet[2807]: E1216 12:47:44.862507 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:44.862645 kubelet[2807]: E1216 12:47:44.862642 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:44.863740 kubelet[2807]: E1216 12:47:44.862936 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-9f54fc4f9-7wdcc_calico-apiserver(1081c7e3-0645-4858-ac62-90f635bb19a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:44.863740 kubelet[2807]: E1216 12:47:44.862970 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9" Dec 16 12:47:44.864564 containerd[1624]: time="2025-12-16T12:47:44.863523294Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:47:45.297133 containerd[1624]: time="2025-12-16T12:47:45.296922733Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:45.298240 containerd[1624]: time="2025-12-16T12:47:45.298144110Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:47:45.298240 containerd[1624]: time="2025-12-16T12:47:45.298213121Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:45.298459 kubelet[2807]: E1216 12:47:45.298401 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:47:45.298459 kubelet[2807]: E1216 12:47:45.298452 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:47:45.298644 kubelet[2807]: E1216 12:47:45.298614 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-h22qr_calico-system(eff3d741-729e-4ed9-a6a3-d314f99d7c29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:45.299458 containerd[1624]: time="2025-12-16T12:47:45.299214814Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:47:45.715081 containerd[1624]: time="2025-12-16T12:47:45.715014754Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:45.716552 containerd[1624]: time="2025-12-16T12:47:45.716431573Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed 
to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:47:45.716552 containerd[1624]: time="2025-12-16T12:47:45.716506765Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:45.718698 kubelet[2807]: E1216 12:47:45.718625 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:45.718761 kubelet[2807]: E1216 12:47:45.718704 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:47:45.718875 kubelet[2807]: E1216 12:47:45.718851 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-9f54fc4f9-v6lkx_calico-apiserver(50e78f30-18dc-4823-b78b-3500e19d4f6f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:45.718921 kubelet[2807]: E1216 12:47:45.718895 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f" Dec 16 12:47:45.719454 containerd[1624]: time="2025-12-16T12:47:45.719423096Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:47:46.148973 containerd[1624]: time="2025-12-16T12:47:46.148890341Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:46.150465 containerd[1624]: time="2025-12-16T12:47:46.150393852Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:47:46.151141 kubelet[2807]: E1216 12:47:46.151058 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:47:46.151723 kubelet[2807]: E1216 12:47:46.151235 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:47:46.154432 kubelet[2807]: E1216 12:47:46.153392 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sdw6q_calico-system(db9e60f0-9cc3-4000-b32c-9ba313d4b676): 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:46.154432 kubelet[2807]: E1216 12:47:46.153457 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676" Dec 16 12:47:46.158726 containerd[1624]: time="2025-12-16T12:47:46.150507837Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:46.158784 containerd[1624]: time="2025-12-16T12:47:46.153262523Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:47:46.598627 containerd[1624]: time="2025-12-16T12:47:46.598548537Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:46.599773 containerd[1624]: time="2025-12-16T12:47:46.599729378Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:47:46.599917 containerd[1624]: time="2025-12-16T12:47:46.599812155Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:46.600713 kubelet[2807]: E1216 12:47:46.600656 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:47:46.600785 kubelet[2807]: E1216 12:47:46.600744 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:47:46.601340 containerd[1624]: time="2025-12-16T12:47:46.601270421Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:47:46.601658 kubelet[2807]: E1216 12:47:46.601582 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-h22qr_calico-system(eff3d741-729e-4ed9-a6a3-d314f99d7c29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:46.601835 kubelet[2807]: E1216 12:47:46.601654 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:47:47.055631 containerd[1624]: time="2025-12-16T12:47:47.055595226Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:47.057073 containerd[1624]: time="2025-12-16T12:47:47.057023765Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:47:47.057191 containerd[1624]: time="2025-12-16T12:47:47.057126419Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:47.057388 kubelet[2807]: E1216 12:47:47.057353 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:47:47.057435 kubelet[2807]: E1216 12:47:47.057399 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:47:47.057501 kubelet[2807]: E1216 12:47:47.057480 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-78c5bf4bd4-c4c2s_calico-system(373b77f0-3f56-4579-96a0-a033d280b187): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:47.059545 containerd[1624]: time="2025-12-16T12:47:47.058921401Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:47:47.494892 containerd[1624]: time="2025-12-16T12:47:47.494780821Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:47.496124 containerd[1624]: time="2025-12-16T12:47:47.496088211Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:47:47.496215 containerd[1624]: time="2025-12-16T12:47:47.496179964Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:47.496383 kubelet[2807]: E1216 12:47:47.496350 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 
12:47:47.496642 kubelet[2807]: E1216 12:47:47.496395 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:47:47.496642 kubelet[2807]: E1216 12:47:47.496559 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-78c5bf4bd4-c4c2s_calico-system(373b77f0-3f56-4579-96a0-a033d280b187): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:47.496642 kubelet[2807]: E1216 12:47:47.496598 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187" Dec 16 12:47:47.498221 containerd[1624]: time="2025-12-16T12:47:47.497844911Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:47:47.913659 containerd[1624]: time="2025-12-16T12:47:47.913596339Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:47:47.915745 containerd[1624]: time="2025-12-16T12:47:47.915690806Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:47:47.915745 containerd[1624]: time="2025-12-16T12:47:47.915718489Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:47:47.916011 kubelet[2807]: E1216 12:47:47.915968 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:47:47.916079 kubelet[2807]: E1216 12:47:47.916016 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:47:47.916109 kubelet[2807]: E1216 12:47:47.916088 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6df777597b-rfdvx_calico-system(ae41bde4-2571-4fb7-adb8-a8808483af15): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:47:47.916138 kubelet[2807]: E1216 12:47:47.916118 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15" Dec 16 12:47:53.001653 kernel: kauditd_printk_skb: 294 callbacks suppressed Dec 16 12:47:53.002506 kernel: audit: type=1130 audit(1765889272.997:730): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.23.34:22-147.75.109.163:49676 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:47:52.997000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.23.34:22-147.75.109.163:49676 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:47:52.997681 systemd[1]: Started sshd@7-77.42.23.34:22-147.75.109.163:49676.service - OpenSSH per-connection server daemon (147.75.109.163:49676). Dec 16 12:47:53.904000 audit[5006]: USER_ACCT pid=5006 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:53.911330 sshd[5006]: Accepted publickey for core from 147.75.109.163 port 49676 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:47:53.913160 kernel: audit: type=1101 audit(1765889273.904:731): pid=5006 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:53.915863 sshd-session[5006]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:47:53.912000 audit[5006]: CRED_ACQ pid=5006 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:53.925911 kernel: audit: type=1103 audit(1765889273.912:732): pid=5006 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:53.925987 kernel: audit: type=1006 audit(1765889273.912:733): pid=5006 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Dec 16 12:47:53.932781 kernel: audit: type=1300 audit(1765889273.912:733): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc14f29830 a2=3 a3=0 items=0 ppid=1 pid=5006 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:53.912000 audit[5006]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc14f29830 a2=3 a3=0 items=0 ppid=1 pid=5006 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:47:53.912000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:47:53.936547 kernel: audit: type=1327 audit(1765889273.912:733): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:47:53.939674 systemd-logind[1607]: New session 9 of user core. Dec 16 12:47:53.944694 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 16 12:47:53.948000 audit[5006]: USER_START pid=5006 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:53.949000 audit[5010]: CRED_ACQ pid=5010 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:53.956593 kernel: audit: type=1105 audit(1765889273.948:734): pid=5006 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:53.956654 kernel: audit: type=1103 audit(1765889273.949:735): pid=5010 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:54.390247 update_engine[1609]: I20251216 12:47:54.390089 1609 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Dec 16 12:47:54.390247 update_engine[1609]: I20251216 12:47:54.390143 1609 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Dec 16 12:47:54.394981 update_engine[1609]: I20251216 12:47:54.394601 1609 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Dec 16 12:47:54.394981 update_engine[1609]: I20251216 12:47:54.394894 1609 omaha_request_params.cc:62] Current group set to alpha Dec 16 12:47:54.396086 update_engine[1609]: I20251216 12:47:54.395591 1609 update_attempter.cc:499] Already updated boot flags. Skipping. Dec 16 12:47:54.397187 update_engine[1609]: I20251216 12:47:54.396493 1609 update_attempter.cc:643] Scheduling an action processor start. 
Dec 16 12:47:54.397187 update_engine[1609]: I20251216 12:47:54.396522 1609 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 16 12:47:54.397187 update_engine[1609]: I20251216 12:47:54.396591 1609 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Dec 16 12:47:54.397187 update_engine[1609]: I20251216 12:47:54.396631 1609 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 16 12:47:54.397187 update_engine[1609]: I20251216 12:47:54.396638 1609 omaha_request_action.cc:272] Request: Dec 16 12:47:54.397187 update_engine[1609]: Dec 16 12:47:54.397187 update_engine[1609]: Dec 16 12:47:54.397187 update_engine[1609]: Dec 16 12:47:54.397187 update_engine[1609]: Dec 16 12:47:54.397187 update_engine[1609]: Dec 16 12:47:54.397187 update_engine[1609]: Dec 16 12:47:54.397187 update_engine[1609]: Dec 16 12:47:54.397187 update_engine[1609]: Dec 16 12:47:54.397187 update_engine[1609]: I20251216 12:47:54.396643 1609 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:47:54.408561 update_engine[1609]: I20251216 12:47:54.408507 1609 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:47:54.409834 update_engine[1609]: I20251216 12:47:54.409804 1609 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 16 12:47:54.415768 update_engine[1609]: E20251216 12:47:54.415663 1609 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 12:47:54.415768 update_engine[1609]: I20251216 12:47:54.415746 1609 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Dec 16 12:47:54.423642 locksmithd[1661]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Dec 16 12:47:55.065386 sshd[5010]: Connection closed by 147.75.109.163 port 49676 Dec 16 12:47:55.066009 sshd-session[5006]: pam_unix(sshd:session): session closed for user core Dec 16 12:47:55.088921 kernel: audit: type=1106 audit(1765889275.078:736): pid=5006 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:55.078000 audit[5006]: USER_END pid=5006 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:55.087130 systemd[1]: sshd@7-77.42.23.34:22-147.75.109.163:49676.service: Deactivated successfully. Dec 16 12:47:55.090183 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 12:47:55.078000 audit[5006]: CRED_DISP pid=5006 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:55.099377 systemd-logind[1607]: Session 9 logged out. Waiting for processes to exit. 
Dec 16 12:47:55.106654 kernel: audit: type=1104 audit(1765889275.078:737): pid=5006 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:47:55.107661 systemd-logind[1607]: Removed session 9. Dec 16 12:47:55.086000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.23.34:22-147.75.109.163:49676 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:47:55.415841 kubelet[2807]: E1216 12:47:55.414961 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9" Dec 16 12:47:58.416567 kubelet[2807]: E1216 12:47:58.415792 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15" Dec 16 12:47:58.418621 kubelet[2807]: E1216 12:47:58.417877 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f" Dec 16 12:48:00.235766 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:48:00.235856 kernel: audit: type=1130 audit(1765889280.232:739): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.23.34:22-147.75.109.163:49690 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:00.232000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.23.34:22-147.75.109.163:49690 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:00.232447 systemd[1]: Started sshd@8-77.42.23.34:22-147.75.109.163:49690.service - OpenSSH per-connection server daemon (147.75.109.163:49690). 
Dec 16 12:48:00.417070 kubelet[2807]: E1216 12:48:00.416985 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676" Dec 16 12:48:00.418879 kubelet[2807]: E1216 12:48:00.418704 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187" Dec 16 12:48:01.106000 audit[5026]: USER_ACCT pid=5026 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.107267 sshd[5026]: Accepted publickey for core from 147.75.109.163 port 49690 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:01.109797 sshd-session[5026]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:01.114658 kernel: audit: type=1101 audit(1765889281.106:740): pid=5026 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.108000 audit[5026]: CRED_ACQ pid=5026 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.123849 kernel: audit: type=1103 audit(1765889281.108:741): pid=5026 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.130645 kernel: audit: type=1006 audit(1765889281.108:742): pid=5026 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Dec 16 12:48:01.108000 audit[5026]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd9a3045f0 a2=3 a3=0 items=0 ppid=1 pid=5026 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:01.141769 kernel: audit: type=1300 audit(1765889281.108:742): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd9a3045f0 a2=3 a3=0 items=0 ppid=1 pid=5026 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:01.146567 kernel: audit: type=1327 audit(1765889281.108:742): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:01.108000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:01.146306 systemd-logind[1607]: New session 10 of user core. Dec 16 12:48:01.151930 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 16 12:48:01.156000 audit[5026]: USER_START pid=5026 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.175108 kernel: audit: type=1105 audit(1765889281.156:743): pid=5026 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.175286 kernel: audit: type=1103 audit(1765889281.166:744): pid=5030 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.166000 audit[5030]: CRED_ACQ pid=5030 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.416479 kubelet[2807]: E1216 12:48:01.415923 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:48:01.716849 sshd[5030]: Connection closed by 147.75.109.163 port 49690 Dec 16 12:48:01.717667 sshd-session[5026]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:01.719000 audit[5026]: USER_END pid=5026 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.722661 systemd[1]: sshd@8-77.42.23.34:22-147.75.109.163:49690.service: Deactivated successfully. Dec 16 12:48:01.730007 kernel: audit: type=1106 audit(1765889281.719:745): pid=5026 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.725160 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 12:48:01.730183 systemd-logind[1607]: Session 10 logged out. Waiting for processes to exit. Dec 16 12:48:01.719000 audit[5026]: CRED_DISP pid=5026 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.722000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.23.34:22-147.75.109.163:49690 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:01.737728 kernel: audit: type=1104 audit(1765889281.719:746): pid=5026 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:01.737398 systemd-logind[1607]: Removed session 10. Dec 16 12:48:01.886413 systemd[1]: Started sshd@9-77.42.23.34:22-147.75.109.163:49692.service - OpenSSH per-connection server daemon (147.75.109.163:49692). Dec 16 12:48:01.886000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-77.42.23.34:22-147.75.109.163:49692 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:48:02.726000 audit[5043]: USER_ACCT pid=5043 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:02.727224 sshd[5043]: Accepted publickey for core from 147.75.109.163 port 49692 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:02.727000 audit[5043]: CRED_ACQ pid=5043 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:02.727000 audit[5043]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe017a8f70 a2=3 a3=0 items=0 ppid=1 pid=5043 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:02.727000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:02.728592 sshd-session[5043]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:02.733893 systemd-logind[1607]: New session 11 of user core. Dec 16 12:48:02.740837 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 16 12:48:02.743000 audit[5043]: USER_START pid=5043 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:02.745000 audit[5070]: CRED_ACQ pid=5070 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:03.369572 sshd[5070]: Connection closed by 147.75.109.163 port 49692 Dec 16 12:48:03.370056 sshd-session[5043]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:03.371000 audit[5043]: USER_END pid=5043 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:03.372000 audit[5043]: CRED_DISP pid=5043 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:03.374932 systemd[1]: sshd@9-77.42.23.34:22-147.75.109.163:49692.service: Deactivated successfully. Dec 16 12:48:03.376000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-77.42.23.34:22-147.75.109.163:49692 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:03.378201 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 12:48:03.380703 systemd-logind[1607]: Session 11 logged out. Waiting for processes to exit. 
Dec 16 12:48:03.382492 systemd-logind[1607]: Removed session 11. Dec 16 12:48:03.582335 systemd[1]: Started sshd@10-77.42.23.34:22-147.75.109.163:57272.service - OpenSSH per-connection server daemon (147.75.109.163:57272). Dec 16 12:48:03.581000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-77.42.23.34:22-147.75.109.163:57272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:04.327598 update_engine[1609]: I20251216 12:48:04.327486 1609 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:48:04.327950 update_engine[1609]: I20251216 12:48:04.327642 1609 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:48:04.328060 update_engine[1609]: I20251216 12:48:04.328014 1609 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 16 12:48:04.328425 update_engine[1609]: E20251216 12:48:04.328389 1609 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 12:48:04.328561 update_engine[1609]: I20251216 12:48:04.328456 1609 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Dec 16 12:48:04.529000 audit[5082]: USER_ACCT pid=5082 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:04.530061 sshd[5082]: Accepted publickey for core from 147.75.109.163 port 57272 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:04.530000 audit[5082]: CRED_ACQ pid=5082 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:04.530000 audit[5082]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffec141dd30 a2=3 a3=0 items=0 ppid=1 pid=5082 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:04.530000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:04.531978 sshd-session[5082]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:04.543349 systemd-logind[1607]: New session 12 of user core. Dec 16 12:48:04.547657 systemd[1]: Started session-12.scope - Session 12 of User core. 
Dec 16 12:48:04.550000 audit[5082]: USER_START pid=5082 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:04.553000 audit[5086]: CRED_ACQ pid=5086 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:05.132504 sshd[5086]: Connection closed by 147.75.109.163 port 57272 Dec 16 12:48:05.130944 sshd-session[5082]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:05.132000 audit[5082]: USER_END pid=5082 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:05.132000 audit[5082]: CRED_DISP pid=5082 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:05.135295 systemd-logind[1607]: Session 12 logged out. Waiting for processes to exit. Dec 16 12:48:05.136984 systemd[1]: sshd@10-77.42.23.34:22-147.75.109.163:57272.service: Deactivated successfully. Dec 16 12:48:05.136000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-77.42.23.34:22-147.75.109.163:57272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:05.139298 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 12:48:05.141800 systemd-logind[1607]: Removed session 12. Dec 16 12:48:08.413899 kubelet[2807]: E1216 12:48:08.413217 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9" Dec 16 12:48:10.280417 systemd[1]: Started sshd@11-77.42.23.34:22-147.75.109.163:57282.service - OpenSSH per-connection server daemon (147.75.109.163:57282). Dec 16 12:48:10.285572 kernel: kauditd_printk_skb: 23 callbacks suppressed Dec 16 12:48:10.285613 kernel: audit: type=1130 audit(1765889290.279:766): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-77.42.23.34:22-147.75.109.163:57282 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:10.279000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-77.42.23.34:22-147.75.109.163:57282 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:48:11.131619 kernel: audit: type=1101 audit(1765889291.123:767): pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.123000 audit[5104]: USER_ACCT pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.125997 sshd-session[5104]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:11.132701 sshd[5104]: Accepted publickey for core from 147.75.109.163 port 57282 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:11.134657 systemd-logind[1607]: New session 13 of user core. Dec 16 12:48:11.123000 audit[5104]: CRED_ACQ pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.140105 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 16 12:48:11.143554 kernel: audit: type=1103 audit(1765889291.123:768): pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.150550 kernel: audit: type=1006 audit(1765889291.123:769): pid=5104 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Dec 16 12:48:11.123000 audit[5104]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc0ccaa890 a2=3 a3=0 items=0 ppid=1 pid=5104 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:11.161538 kernel: audit: type=1300 audit(1765889291.123:769): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc0ccaa890 a2=3 a3=0 items=0 ppid=1 pid=5104 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:11.123000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:11.166548 kernel: audit: type=1327 audit(1765889291.123:769): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:11.144000 audit[5104]: USER_START pid=5104 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.177552 kernel: audit: type=1105 audit(1765889291.144:770): pid=5104 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.149000 audit[5108]: CRED_ACQ pid=5108 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.187539 kernel: audit: type=1103 audit(1765889291.149:771): pid=5108 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.413885 kubelet[2807]: E1216 12:48:11.413774 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f" Dec 16 12:48:11.669473 sshd[5108]: Connection closed by 147.75.109.163 port 57282 Dec 16 12:48:11.670142 sshd-session[5104]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:11.670000 audit[5104]: USER_END pid=5104 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.670000 audit[5104]: CRED_DISP pid=5104 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.682884 systemd-logind[1607]: Session 13 logged out. Waiting for processes to exit. Dec 16 12:48:11.683605 kernel: audit: type=1106 audit(1765889291.670:772): pid=5104 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.683649 kernel: audit: type=1104 audit(1765889291.670:773): pid=5104 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:11.684651 systemd[1]: sshd@11-77.42.23.34:22-147.75.109.163:57282.service: Deactivated successfully. Dec 16 12:48:11.687224 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 12:48:11.684000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-77.42.23.34:22-147.75.109.163:57282 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:11.692162 systemd-logind[1607]: Removed session 13. 
Dec 16 12:48:12.414046 kubelet[2807]: E1216 12:48:12.413693 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676" Dec 16 12:48:13.415831 kubelet[2807]: E1216 12:48:13.414235 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15" Dec 16 12:48:13.416622 kubelet[2807]: E1216 12:48:13.416587 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:48:14.329283 update_engine[1609]: I20251216 12:48:14.328563 1609 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:48:14.329283 update_engine[1609]: I20251216 12:48:14.328649 1609 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:48:14.329283 update_engine[1609]: I20251216 12:48:14.328999 1609 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 16 12:48:14.330765 update_engine[1609]: E20251216 12:48:14.330742 1609 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 12:48:14.330887 update_engine[1609]: I20251216 12:48:14.330870 1609 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Dec 16 12:48:15.414678 kubelet[2807]: E1216 12:48:15.414126 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187" Dec 16 12:48:16.889669 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:48:16.889776 kernel: audit: type=1130 audit(1765889296.877:775): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-77.42.23.34:22-147.75.109.163:51270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:16.877000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-77.42.23.34:22-147.75.109.163:51270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:16.877752 systemd[1]: Started sshd@12-77.42.23.34:22-147.75.109.163:51270.service - OpenSSH per-connection server daemon (147.75.109.163:51270). 
Dec 16 12:48:17.815000 audit[5122]: USER_ACCT pid=5122 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:17.824165 sshd[5122]: Accepted publickey for core from 147.75.109.163 port 51270 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:17.824916 kernel: audit: type=1101 audit(1765889297.815:776): pid=5122 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:17.826439 sshd-session[5122]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:17.824000 audit[5122]: CRED_ACQ pid=5122 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:17.836212 kernel: audit: type=1103 audit(1765889297.824:777): pid=5122 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:17.836654 kernel: audit: type=1006 audit(1765889297.824:778): pid=5122 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Dec 16 12:48:17.824000 audit[5122]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc6a9aa5a0 a2=3 a3=0 items=0 ppid=1 pid=5122 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:17.841616 kernel: audit: type=1300 audit(1765889297.824:778): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc6a9aa5a0 a2=3 a3=0 items=0 ppid=1 pid=5122 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:17.824000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:17.849603 kernel: audit: type=1327 audit(1765889297.824:778): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:17.851684 systemd-logind[1607]: New session 14 of user core. Dec 16 12:48:17.858685 systemd[1]: Started session-14.scope - Session 14 of User core. 
Dec 16 12:48:17.870052 kernel: audit: type=1105 audit(1765889297.860:779): pid=5122 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:17.860000 audit[5122]: USER_START pid=5122 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:17.869000 audit[5126]: CRED_ACQ pid=5126 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:17.877779 kernel: audit: type=1103 audit(1765889297.869:780): pid=5126 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:18.435773 sshd[5126]: Connection closed by 147.75.109.163 port 51270 Dec 16 12:48:18.437453 sshd-session[5122]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:18.439000 audit[5122]: USER_END pid=5122 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:18.454125 kernel: audit: type=1106 audit(1765889298.439:781): pid=5122 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:18.454542 systemd[1]: sshd@12-77.42.23.34:22-147.75.109.163:51270.service: Deactivated successfully. Dec 16 12:48:18.439000 audit[5122]: CRED_DISP pid=5122 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:18.465579 kernel: audit: type=1104 audit(1765889298.439:782): pid=5122 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:18.458211 systemd[1]: session-14.scope: Deactivated successfully. Dec 16 12:48:18.462764 systemd-logind[1607]: Session 14 logged out. Waiting for processes to exit. Dec 16 12:48:18.464572 systemd-logind[1607]: Removed session 14. Dec 16 12:48:18.453000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-77.42.23.34:22-147.75.109.163:51270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:48:23.413624 kubelet[2807]: E1216 12:48:23.413570 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9" Dec 16 12:48:23.590229 systemd[1]: Started sshd@13-77.42.23.34:22-147.75.109.163:35834.service - OpenSSH per-connection server daemon (147.75.109.163:35834). Dec 16 12:48:23.588000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.23.34:22-147.75.109.163:35834 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:23.601988 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:48:23.602064 kernel: audit: type=1130 audit(1765889303.588:784): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.23.34:22-147.75.109.163:35834 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:24.333466 update_engine[1609]: I20251216 12:48:24.332290 1609 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:48:24.333466 update_engine[1609]: I20251216 12:48:24.332378 1609 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:48:24.333466 update_engine[1609]: I20251216 12:48:24.332825 1609 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 16 12:48:24.333466 update_engine[1609]: E20251216 12:48:24.333190 1609 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 12:48:24.333466 update_engine[1609]: I20251216 12:48:24.333243 1609 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 16 12:48:24.333466 update_engine[1609]: I20251216 12:48:24.333249 1609 omaha_request_action.cc:617] Omaha request response: Dec 16 12:48:24.333466 update_engine[1609]: E20251216 12:48:24.333356 1609 omaha_request_action.cc:636] Omaha request network transfer failed. Dec 16 12:48:24.339926 update_engine[1609]: I20251216 12:48:24.339869 1609 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Dec 16 12:48:24.339926 update_engine[1609]: I20251216 12:48:24.339915 1609 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 16 12:48:24.339926 update_engine[1609]: I20251216 12:48:24.339921 1609 update_attempter.cc:306] Processing Done. Dec 16 12:48:24.340072 update_engine[1609]: E20251216 12:48:24.339938 1609 update_attempter.cc:619] Update failed. Dec 16 12:48:24.340072 update_engine[1609]: I20251216 12:48:24.339943 1609 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Dec 16 12:48:24.340072 update_engine[1609]: I20251216 12:48:24.339946 1609 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Dec 16 12:48:24.340072 update_engine[1609]: I20251216 12:48:24.339962 1609 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Dec 16 12:48:24.340072 update_engine[1609]: I20251216 12:48:24.340042 1609 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 16 12:48:24.340072 update_engine[1609]: I20251216 12:48:24.340068 1609 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 16 12:48:24.340072 update_engine[1609]: I20251216 12:48:24.340073 1609 omaha_request_action.cc:272] Request: Dec 16 12:48:24.340072 update_engine[1609]: Dec 16 12:48:24.340072 update_engine[1609]: Dec 16 12:48:24.340072 update_engine[1609]: Dec 16 12:48:24.340072 update_engine[1609]: Dec 16 12:48:24.340072 update_engine[1609]: Dec 16 12:48:24.340072 update_engine[1609]: Dec 16 12:48:24.340072 update_engine[1609]: I20251216 12:48:24.340078 1609 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:48:24.340284 update_engine[1609]: I20251216 12:48:24.340106 1609 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:48:24.340470 update_engine[1609]: I20251216 12:48:24.340435 1609 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 16 12:48:24.342158 locksmithd[1661]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Dec 16 12:48:24.342370 update_engine[1609]: E20251216 12:48:24.342208 1609 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 12:48:24.342370 update_engine[1609]: I20251216 12:48:24.342260 1609 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 16 12:48:24.342370 update_engine[1609]: I20251216 12:48:24.342267 1609 omaha_request_action.cc:617] Omaha request response: Dec 16 12:48:24.342370 update_engine[1609]: I20251216 12:48:24.342271 1609 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 16 12:48:24.342370 update_engine[1609]: I20251216 12:48:24.342274 1609 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 16 12:48:24.342370 update_engine[1609]: I20251216 12:48:24.342278 1609 update_attempter.cc:306] Processing Done. Dec 16 12:48:24.342370 update_engine[1609]: I20251216 12:48:24.342281 1609 update_attempter.cc:310] Error event sent. 
Dec 16 12:48:24.342370 update_engine[1609]: I20251216 12:48:24.342288 1609 update_check_scheduler.cc:74] Next update check in 42m20s Dec 16 12:48:24.342691 locksmithd[1661]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Dec 16 12:48:24.413679 kubelet[2807]: E1216 12:48:24.413423 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f" Dec 16 12:48:24.413679 kubelet[2807]: E1216 12:48:24.413558 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15" Dec 16 12:48:24.432000 audit[5138]: USER_ACCT pid=5138 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:24.437932 sshd[5138]: Accepted publickey for core from 147.75.109.163 port 35834 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:24.441024 sshd-session[5138]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:24.443622 kernel: audit: type=1101 audit(1765889304.432:785): pid=5138 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:24.446663 systemd-logind[1607]: New session 15 of user core. 
Dec 16 12:48:24.435000 audit[5138]: CRED_ACQ pid=5138 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:24.459615 kernel: audit: type=1103 audit(1765889304.435:786): pid=5138 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:24.459678 kernel: audit: type=1006 audit(1765889304.435:787): pid=5138 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Dec 16 12:48:24.435000 audit[5138]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffce4f8efe0 a2=3 a3=0 items=0 ppid=1 pid=5138 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:24.472550 kernel: audit: type=1300 audit(1765889304.435:787): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffce4f8efe0 a2=3 a3=0 items=0 ppid=1 pid=5138 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:24.472638 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 16 12:48:24.435000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:24.477061 kernel: audit: type=1327 audit(1765889304.435:787): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:24.478000 audit[5138]: USER_START pid=5138 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:24.482000 audit[5142]: CRED_ACQ pid=5142 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:24.494238 kernel: audit: type=1105 audit(1765889304.478:788): pid=5138 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:24.494303 kernel: audit: type=1103 audit(1765889304.482:789): pid=5142 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:25.031586 sshd[5142]: Connection closed by 147.75.109.163 port 35834 Dec 16 12:48:25.032836 sshd-session[5138]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:25.033000 audit[5138]: USER_END pid=5138 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:25.044625 kernel: audit: type=1106 audit(1765889305.033:790): pid=5138 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:25.045074 systemd[1]: sshd@13-77.42.23.34:22-147.75.109.163:35834.service: Deactivated successfully. Dec 16 12:48:25.047482 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 12:48:25.050212 systemd-logind[1607]: Session 15 logged out. Waiting for processes to exit. Dec 16 12:48:25.051601 systemd-logind[1607]: Removed session 15. Dec 16 12:48:25.033000 audit[5138]: CRED_DISP pid=5138 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:25.043000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.23.34:22-147.75.109.163:35834 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:25.059576 kernel: audit: type=1104 audit(1765889305.033:791): pid=5138 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:25.237981 systemd[1]: Started sshd@14-77.42.23.34:22-147.75.109.163:35842.service - OpenSSH per-connection server daemon (147.75.109.163:35842). Dec 16 12:48:25.236000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-77.42.23.34:22-147.75.109.163:35842 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:48:25.414832 containerd[1624]: time="2025-12-16T12:48:25.414793410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:48:25.859289 containerd[1624]: time="2025-12-16T12:48:25.859240906Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:48:25.860503 containerd[1624]: time="2025-12-16T12:48:25.860459114Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:48:25.860649 containerd[1624]: time="2025-12-16T12:48:25.860585051Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:48:25.860882 kubelet[2807]: E1216 12:48:25.860826 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:48:25.860882 kubelet[2807]: E1216 12:48:25.860873 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:48:25.861558 kubelet[2807]: E1216 12:48:25.861006 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-h22qr_calico-system(eff3d741-729e-4ed9-a6a3-d314f99d7c29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:48:25.862106 kubelet[2807]: E1216 12:48:25.862070 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:48:26.194000 audit[5156]: USER_ACCT pid=5156 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:26.196163 sshd[5156]: Accepted publickey for core from 147.75.109.163 port 35842 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:26.195000 audit[5156]: CRED_ACQ pid=5156 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 
addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:26.195000 audit[5156]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe2f2ef670 a2=3 a3=0 items=0 ppid=1 pid=5156 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:26.195000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:26.197951 sshd-session[5156]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:26.202762 systemd-logind[1607]: New session 16 of user core. Dec 16 12:48:26.205693 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 16 12:48:26.206000 audit[5156]: USER_START pid=5156 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:26.208000 audit[5160]: CRED_ACQ pid=5160 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:26.414222 containerd[1624]: time="2025-12-16T12:48:26.414175976Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:48:26.868752 containerd[1624]: time="2025-12-16T12:48:26.868515803Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:48:26.870282 containerd[1624]: time="2025-12-16T12:48:26.870224386Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:48:26.870718 containerd[1624]: time="2025-12-16T12:48:26.870393845Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:48:26.871632 kubelet[2807]: E1216 12:48:26.871599 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:48:26.873376 kubelet[2807]: E1216 12:48:26.871862 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:48:26.873376 kubelet[2807]: E1216 12:48:26.871986 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sdw6q_calico-system(db9e60f0-9cc3-4000-b32c-9ba313d4b676): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:48:26.873376 kubelet[2807]: E1216 12:48:26.872036 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676" Dec 16 12:48:27.102697 sshd[5160]: Connection closed by 147.75.109.163 port 35842 Dec 16 12:48:27.112697 sshd-session[5156]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:27.115000 audit[5156]: USER_END pid=5156 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:27.116000 audit[5156]: CRED_DISP pid=5156 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:27.121957 systemd-logind[1607]: Session 16 logged out. Waiting for processes to exit. Dec 16 12:48:27.122438 systemd[1]: sshd@14-77.42.23.34:22-147.75.109.163:35842.service: Deactivated successfully. Dec 16 12:48:27.121000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-77.42.23.34:22-147.75.109.163:35842 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:27.125143 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 12:48:27.131010 systemd-logind[1607]: Removed session 16. Dec 16 12:48:27.253846 systemd[1]: Started sshd@15-77.42.23.34:22-147.75.109.163:35854.service - OpenSSH per-connection server daemon (147.75.109.163:35854). Dec 16 12:48:27.253000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-77.42.23.34:22-147.75.109.163:35854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:48:28.108000 audit[5171]: USER_ACCT pid=5171 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:28.110000 sshd[5171]: Accepted publickey for core from 147.75.109.163 port 35854 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:28.109000 audit[5171]: CRED_ACQ pid=5171 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:28.109000 audit[5171]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffff2e89200 a2=3 a3=0 items=0 ppid=1 pid=5171 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:28.109000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:28.112364 sshd-session[5171]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:28.118652 systemd-logind[1607]: New session 17 of user core. Dec 16 12:48:28.122721 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 16 12:48:28.126000 audit[5171]: USER_START pid=5171 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:28.127000 audit[5183]: CRED_ACQ pid=5183 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:29.189000 audit[5193]: NETFILTER_CFG table=filter:129 family=2 entries=26 op=nft_register_rule pid=5193 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:48:29.194483 kernel: kauditd_printk_skb: 20 callbacks suppressed Dec 16 12:48:29.194559 kernel: audit: type=1325 audit(1765889309.189:808): table=filter:129 family=2 entries=26 op=nft_register_rule pid=5193 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:48:29.189000 audit[5193]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffef7e14220 a2=0 a3=7ffef7e1420c items=0 ppid=2972 pid=5193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:29.208546 kernel: audit: type=1300 audit(1765889309.189:808): arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffef7e14220 a2=0 a3=7ffef7e1420c items=0 ppid=2972 pid=5193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:29.189000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:48:29.214214 kernel: audit: type=1327 
audit(1765889309.189:808): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:48:29.214789 kernel: audit: type=1325 audit(1765889309.198:809): table=nat:130 family=2 entries=20 op=nft_register_rule pid=5193 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:48:29.198000 audit[5193]: NETFILTER_CFG table=nat:130 family=2 entries=20 op=nft_register_rule pid=5193 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:48:29.198000 audit[5193]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffef7e14220 a2=0 a3=0 items=0 ppid=2972 pid=5193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:29.198000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:48:29.228095 kernel: audit: type=1300 audit(1765889309.198:809): arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffef7e14220 a2=0 a3=0 items=0 ppid=2972 pid=5193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:29.228203 kernel: audit: type=1327 audit(1765889309.198:809): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:48:29.347305 sshd[5183]: Connection closed by 147.75.109.163 port 35854 Dec 16 12:48:29.347726 sshd-session[5171]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:29.350000 audit[5171]: USER_END pid=5171 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:29.356347 systemd[1]: sshd@15-77.42.23.34:22-147.75.109.163:35854.service: Deactivated successfully. Dec 16 12:48:29.359215 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 12:48:29.361607 kernel: audit: type=1106 audit(1765889309.350:810): pid=5171 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:29.362310 kernel: audit: type=1104 audit(1765889309.350:811): pid=5171 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:29.350000 audit[5171]: CRED_DISP pid=5171 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:29.362642 systemd-logind[1607]: Session 17 logged out. Waiting for processes to exit. Dec 16 12:48:29.363760 systemd-logind[1607]: Removed session 17. 
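Annotation (not part of the captured journal): the PROCTITLE records above (type=1327) carry the audited command line hex-encoded, with NUL bytes separating argv entries. A minimal decoding sketch in plain Python, standard library only; the two example values are copied verbatim from the records above and nothing else is assumed:

    # Decode an audit PROCTITLE value: hex-encoded argv, NUL-separated.
    def decode_proctitle(hex_value: str) -> str:
        return bytes.fromhex(hex_value).replace(b"\x00", b" ").decode("utf-8", "replace")

    # Values copied from the records above:
    print(decode_proctitle("737368642D73657373696F6E3A20636F7265205B707269765D"))
    # -> sshd-session: core [priv]
    print(decode_proctitle("69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273"))
    # -> iptables-restore -w 5 --noflush --counters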
Dec 16 12:48:29.355000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-77.42.23.34:22-147.75.109.163:35854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:29.370189 kernel: audit: type=1131 audit(1765889309.355:812): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-77.42.23.34:22-147.75.109.163:35854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:29.511434 systemd[1]: Started sshd@16-77.42.23.34:22-147.75.109.163:35870.service - OpenSSH per-connection server daemon (147.75.109.163:35870). Dec 16 12:48:29.510000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-77.42.23.34:22-147.75.109.163:35870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:29.519566 kernel: audit: type=1130 audit(1765889309.510:813): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-77.42.23.34:22-147.75.109.163:35870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:30.223000 audit[5202]: NETFILTER_CFG table=filter:131 family=2 entries=38 op=nft_register_rule pid=5202 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:48:30.223000 audit[5202]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffc3b3c50a0 a2=0 a3=7ffc3b3c508c items=0 ppid=2972 pid=5202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:30.223000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:48:30.225000 audit[5202]: NETFILTER_CFG table=nat:132 family=2 entries=20 op=nft_register_rule pid=5202 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:48:30.225000 audit[5202]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffc3b3c50a0 a2=0 a3=0 items=0 ppid=2972 pid=5202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:30.225000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:48:30.368670 sshd[5198]: Accepted publickey for core from 147.75.109.163 port 35870 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:30.367000 audit[5198]: USER_ACCT pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:30.368000 audit[5198]: CRED_ACQ pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:30.369000 audit[5198]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff2e203850 a2=3 a3=0 items=0 ppid=1 pid=5198 auid=500 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:30.369000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:30.371910 sshd-session[5198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:30.377888 systemd-logind[1607]: New session 18 of user core. Dec 16 12:48:30.384668 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 16 12:48:30.386000 audit[5198]: USER_START pid=5198 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:30.388000 audit[5204]: CRED_ACQ pid=5204 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:30.417300 containerd[1624]: time="2025-12-16T12:48:30.417240930Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:48:30.853459 containerd[1624]: time="2025-12-16T12:48:30.853399196Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:48:30.854577 containerd[1624]: time="2025-12-16T12:48:30.854500963Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:48:30.854662 containerd[1624]: time="2025-12-16T12:48:30.854635016Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:48:30.854945 kubelet[2807]: E1216 12:48:30.854877 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:48:30.855236 kubelet[2807]: E1216 12:48:30.854965 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:48:30.855236 kubelet[2807]: E1216 12:48:30.855215 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-78c5bf4bd4-c4c2s_calico-system(373b77f0-3f56-4579-96a0-a033d280b187): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:48:30.857332 containerd[1624]: time="2025-12-16T12:48:30.857303800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:48:31.121735 sshd[5204]: Connection closed by 147.75.109.163 port 35870 Dec 16 12:48:31.122759 sshd-session[5198]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:31.123000 
audit[5198]: USER_END pid=5198 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:31.124000 audit[5198]: CRED_DISP pid=5198 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:31.128448 systemd-logind[1607]: Session 18 logged out. Waiting for processes to exit. Dec 16 12:48:31.129205 systemd[1]: sshd@16-77.42.23.34:22-147.75.109.163:35870.service: Deactivated successfully. Dec 16 12:48:31.128000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-77.42.23.34:22-147.75.109.163:35870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:31.133089 systemd[1]: session-18.scope: Deactivated successfully. Dec 16 12:48:31.136086 systemd-logind[1607]: Removed session 18. Dec 16 12:48:31.293000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-77.42.23.34:22-147.75.109.163:35876 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:31.294667 systemd[1]: Started sshd@17-77.42.23.34:22-147.75.109.163:35876.service - OpenSSH per-connection server daemon (147.75.109.163:35876). Dec 16 12:48:31.306314 containerd[1624]: time="2025-12-16T12:48:31.305929128Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:48:31.308101 containerd[1624]: time="2025-12-16T12:48:31.308026324Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:48:31.308431 containerd[1624]: time="2025-12-16T12:48:31.308400880Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:48:31.309021 kubelet[2807]: E1216 12:48:31.308781 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:48:31.309021 kubelet[2807]: E1216 12:48:31.308819 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:48:31.309021 kubelet[2807]: E1216 12:48:31.308879 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-78c5bf4bd4-c4c2s_calico-system(373b77f0-3f56-4579-96a0-a033d280b187): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:48:31.309021 kubelet[2807]: E1216 12:48:31.308914 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187" Dec 16 12:48:32.135000 audit[5214]: USER_ACCT pid=5214 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:32.137654 sshd[5214]: Accepted publickey for core from 147.75.109.163 port 35876 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:32.137000 audit[5214]: CRED_ACQ pid=5214 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:32.137000 audit[5214]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeba28b780 a2=3 a3=0 items=0 ppid=1 pid=5214 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:32.137000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:32.139375 sshd-session[5214]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:32.143921 systemd-logind[1607]: New session 19 of user core. Dec 16 12:48:32.149745 systemd[1]: Started session-19.scope - Session 19 of User core. 
Dec 16 12:48:32.155000 audit[5214]: USER_START pid=5214 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:32.158000 audit[5218]: CRED_ACQ pid=5218 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:32.705611 sshd[5218]: Connection closed by 147.75.109.163 port 35876 Dec 16 12:48:32.707703 sshd-session[5214]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:32.709000 audit[5214]: USER_END pid=5214 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:32.711000 audit[5214]: CRED_DISP pid=5214 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:32.716000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-77.42.23.34:22-147.75.109.163:35876 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:32.717614 systemd[1]: sshd@17-77.42.23.34:22-147.75.109.163:35876.service: Deactivated successfully. Dec 16 12:48:32.720954 systemd[1]: session-19.scope: Deactivated successfully. Dec 16 12:48:32.723319 systemd-logind[1607]: Session 19 logged out. Waiting for processes to exit. Dec 16 12:48:32.725842 systemd-logind[1607]: Removed session 19. 
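Annotation (not part of the captured journal): the kubelet and containerd entries above repeat the same failure shape for several images under ghcr.io/flatcar/calico at tag v3.30.4 (404 from the registry, then ErrImagePull). A small summarizing sketch over a pasted journal dump on stdin; the regular expression simply mirrors the "failed to resolve image: ...: not found" wording seen above, and piping journal text into it is illustrative, not something the log prescribes:

    import re
    import sys
    from collections import Counter

    # Count each image reference reported as unresolvable in the journal text on stdin.
    pattern = re.compile(r"failed to resolve image: (\S+?): not found")
    counts = Counter(pattern.findall(sys.stdin.read()))
    for image, n in counts.most_common():
        print(f"{n:4d}  {image}")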
Dec 16 12:48:33.738000 audit[5254]: NETFILTER_CFG table=filter:133 family=2 entries=26 op=nft_register_rule pid=5254 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:48:33.738000 audit[5254]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffea7133fd0 a2=0 a3=7ffea7133fbc items=0 ppid=2972 pid=5254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:33.738000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:48:33.748000 audit[5254]: NETFILTER_CFG table=nat:134 family=2 entries=104 op=nft_register_chain pid=5254 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:48:33.748000 audit[5254]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffea7133fd0 a2=0 a3=7ffea7133fbc items=0 ppid=2972 pid=5254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:33.748000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:48:34.416420 containerd[1624]: time="2025-12-16T12:48:34.416372418Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:48:34.878334 containerd[1624]: time="2025-12-16T12:48:34.878170319Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:48:34.879337 containerd[1624]: time="2025-12-16T12:48:34.879305109Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:48:34.879618 containerd[1624]: time="2025-12-16T12:48:34.879425647Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:48:34.879776 kubelet[2807]: E1216 12:48:34.879744 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:48:34.881097 kubelet[2807]: E1216 12:48:34.880616 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:48:34.881097 kubelet[2807]: E1216 12:48:34.880733 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-9f54fc4f9-7wdcc_calico-apiserver(1081c7e3-0645-4858-ac62-90f635bb19a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:48:34.881097 kubelet[2807]: E1216 12:48:34.880766 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9" Dec 16 12:48:36.418766 containerd[1624]: time="2025-12-16T12:48:36.418718110Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:48:36.860373 containerd[1624]: time="2025-12-16T12:48:36.860318042Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:48:36.861357 containerd[1624]: time="2025-12-16T12:48:36.861303440Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:48:36.861438 containerd[1624]: time="2025-12-16T12:48:36.861410522Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:48:36.862714 kubelet[2807]: E1216 12:48:36.862672 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:48:36.863009 kubelet[2807]: E1216 12:48:36.862721 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:48:36.863009 kubelet[2807]: E1216 12:48:36.862876 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6df777597b-rfdvx_calico-system(ae41bde4-2571-4fb7-adb8-a8808483af15): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:48:36.863009 kubelet[2807]: E1216 12:48:36.862909 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15" Dec 16 12:48:36.863412 containerd[1624]: time="2025-12-16T12:48:36.863369757Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:48:37.329862 containerd[1624]: time="2025-12-16T12:48:37.329809805Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:48:37.331175 containerd[1624]: time="2025-12-16T12:48:37.331116880Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:48:37.331228 containerd[1624]: time="2025-12-16T12:48:37.331206018Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:48:37.331431 kubelet[2807]: E1216 12:48:37.331381 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:48:37.331431 kubelet[2807]: E1216 12:48:37.331430 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:48:37.331522 kubelet[2807]: E1216 12:48:37.331508 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-9f54fc4f9-v6lkx_calico-apiserver(50e78f30-18dc-4823-b78b-3500e19d4f6f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:48:37.331585 kubelet[2807]: E1216 12:48:37.331564 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f" Dec 16 12:48:37.417344 containerd[1624]: time="2025-12-16T12:48:37.417260435Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:48:37.854616 containerd[1624]: time="2025-12-16T12:48:37.854489531Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:48:37.856586 containerd[1624]: time="2025-12-16T12:48:37.856324882Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:48:37.856586 containerd[1624]: time="2025-12-16T12:48:37.856359168Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:48:37.856847 kubelet[2807]: E1216 12:48:37.856777 2807 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:48:37.856942 kubelet[2807]: E1216 12:48:37.856852 2807 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:48:37.856983 kubelet[2807]: E1216 12:48:37.856957 2807 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-h22qr_calico-system(eff3d741-729e-4ed9-a6a3-d314f99d7c29): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:48:37.857064 kubelet[2807]: E1216 12:48:37.857017 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29" Dec 16 12:48:37.920571 kernel: kauditd_printk_skb: 33 callbacks suppressed Dec 16 12:48:37.920695 kernel: audit: type=1130 audit(1765889317.916:835): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.23.34:22-147.75.109.163:54356 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:37.916000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.23.34:22-147.75.109.163:54356 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:37.917923 systemd[1]: Started sshd@18-77.42.23.34:22-147.75.109.163:54356.service - OpenSSH per-connection server daemon (147.75.109.163:54356). 
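Annotation (not part of the captured journal): between the ErrImagePull entries the kubelet reports ImagePullBackOff, i.e. it is waiting out an exponential backoff before retrying the pull. A sketch of that schedule, assuming the upstream kubelet defaults of a 10 s initial delay doubling up to a 5 min cap; those two values are an assumption here, not something stated in this log:

    # Assumed image-pull backoff: start at 10 s, double each failure, cap at 300 s.
    def backoff_schedule(initial: float = 10.0, cap: float = 300.0, steps: int = 8):
        delay = initial
        for _ in range(steps):
            yield delay
            delay = min(delay * 2, cap)

    print(", ".join(f"{d:.0f}s" for d in backoff_schedule()))
    # -> 10s, 20s, 40s, 80s, 160s, 300s, 300s, 300s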
Dec 16 12:48:38.866622 kernel: audit: type=1101 audit(1765889318.855:836): pid=5257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:38.855000 audit[5257]: USER_ACCT pid=5257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:38.866939 sshd[5257]: Accepted publickey for core from 147.75.109.163 port 54356 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:38.872055 sshd-session[5257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:38.869000 audit[5257]: CRED_ACQ pid=5257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:38.881633 kernel: audit: type=1103 audit(1765889318.869:837): pid=5257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:38.886715 systemd-logind[1607]: New session 20 of user core. Dec 16 12:48:38.893546 kernel: audit: type=1006 audit(1765889318.869:838): pid=5257 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Dec 16 12:48:38.893899 systemd[1]: Started session-20.scope - Session 20 of User core. 
Dec 16 12:48:38.869000 audit[5257]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe43830cc0 a2=3 a3=0 items=0 ppid=1 pid=5257 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:38.903969 kernel: audit: type=1300 audit(1765889318.869:838): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe43830cc0 a2=3 a3=0 items=0 ppid=1 pid=5257 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:38.869000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:38.910551 kernel: audit: type=1327 audit(1765889318.869:838): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:38.903000 audit[5257]: USER_START pid=5257 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:38.920619 kernel: audit: type=1105 audit(1765889318.903:839): pid=5257 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:38.905000 audit[5261]: CRED_ACQ pid=5261 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:38.930731 kernel: audit: type=1103 audit(1765889318.905:840): pid=5261 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:39.499314 sshd[5261]: Connection closed by 147.75.109.163 port 54356 Dec 16 12:48:39.500064 sshd-session[5257]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:39.500000 audit[5257]: USER_END pid=5257 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:39.509978 systemd[1]: sshd@18-77.42.23.34:22-147.75.109.163:54356.service: Deactivated successfully. 
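Annotation (not part of the captured journal): where kauditd replays a record through the kernel ring buffer it prints only the numeric type ("audit: type=NNNN"), while the matching userspace line carries the symbolic name, so the pairs can be read off the log itself. A small lookup sketch built from those visible correspondences; only the name for 1006 is taken from linux/audit.h rather than from a line above:

    # Numeric audit record types seen in this log, paired with the symbolic names
    # printed on the matching userspace lines above.
    AUDIT_TYPES = {
        1006: "LOGIN",          # appears only numerically here; LOGIN is its name in linux/audit.h
        1101: "USER_ACCT",      # op=PAM:accounting
        1103: "CRED_ACQ",       # op=PAM:setcred on session open
        1104: "CRED_DISP",      # op=PAM:setcred on session close
        1105: "USER_START",     # op=PAM:session_open
        1106: "USER_END",       # op=PAM:session_close
        1130: "SERVICE_START",  # systemd unit started
        1131: "SERVICE_STOP",   # systemd unit stopped
        1300: "SYSCALL",
        1325: "NETFILTER_CFG",
        1327: "PROCTITLE",
    }

    def name_of(audit_type: int) -> str:
        return AUDIT_TYPES.get(audit_type, f"UNKNOWN({audit_type})")

    print(name_of(1106))  # -> USER_END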
Dec 16 12:48:39.518718 kernel: audit: type=1106 audit(1765889319.500:841): pid=5257 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:39.518793 kernel: audit: type=1104 audit(1765889319.500:842): pid=5257 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:39.500000 audit[5257]: CRED_DISP pid=5257 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:39.512938 systemd[1]: session-20.scope: Deactivated successfully. Dec 16 12:48:39.515232 systemd-logind[1607]: Session 20 logged out. Waiting for processes to exit. Dec 16 12:48:39.516682 systemd-logind[1607]: Removed session 20. Dec 16 12:48:39.509000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.23.34:22-147.75.109.163:54356 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:40.416925 kubelet[2807]: E1216 12:48:40.416856 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676" Dec 16 12:48:44.414819 kubelet[2807]: E1216 12:48:44.414725 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187" Dec 16 12:48:44.689514 systemd[1]: Started sshd@19-77.42.23.34:22-147.75.109.163:49648.service - OpenSSH per-connection server daemon (147.75.109.163:49648). Dec 16 12:48:44.688000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.23.34:22-147.75.109.163:49648 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:48:44.691242 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:48:44.691335 kernel: audit: type=1130 audit(1765889324.688:844): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.23.34:22-147.75.109.163:49648 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:48:45.662000 audit[5295]: USER_ACCT pid=5295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:45.667215 sshd-session[5295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:48:45.669791 sshd[5295]: Accepted publickey for core from 147.75.109.163 port 49648 ssh2: RSA SHA256:EtC75xtaTSJ+84wEKbjmapFbgFAJxSn7WaNHLp2aTq4 Dec 16 12:48:45.679251 systemd-logind[1607]: New session 21 of user core. Dec 16 12:48:45.662000 audit[5295]: CRED_ACQ pid=5295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:45.684607 kernel: audit: type=1101 audit(1765889325.662:845): pid=5295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:45.684709 kernel: audit: type=1103 audit(1765889325.662:846): pid=5295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:45.699061 kernel: audit: type=1006 audit(1765889325.662:847): pid=5295 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Dec 16 12:48:45.704564 kernel: audit: type=1300 audit(1765889325.662:847): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeda3109d0 a2=3 a3=0 items=0 ppid=1 pid=5295 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:45.662000 audit[5295]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeda3109d0 a2=3 a3=0 items=0 ppid=1 pid=5295 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:48:45.703869 systemd[1]: Started session-21.scope - Session 21 of User core. 
Dec 16 12:48:45.714601 kernel: audit: type=1327 audit(1765889325.662:847): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:45.662000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:48:45.711000 audit[5295]: USER_START pid=5295 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:45.713000 audit[5299]: CRED_ACQ pid=5299 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:45.726967 kernel: audit: type=1105 audit(1765889325.711:848): pid=5295 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:45.727018 kernel: audit: type=1103 audit(1765889325.713:849): pid=5299 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:46.304993 sshd[5299]: Connection closed by 147.75.109.163 port 49648 Dec 16 12:48:46.306199 sshd-session[5295]: pam_unix(sshd:session): session closed for user core Dec 16 12:48:46.308000 audit[5295]: USER_END pid=5295 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:46.320598 kernel: audit: type=1106 audit(1765889326.308:850): pid=5295 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:46.320665 kernel: audit: type=1104 audit(1765889326.312:851): pid=5295 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:46.312000 audit[5295]: CRED_DISP pid=5295 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=147.75.109.163 addr=147.75.109.163 terminal=ssh res=success' Dec 16 12:48:46.324854 systemd-logind[1607]: Session 21 logged out. Waiting for processes to exit. Dec 16 12:48:46.326744 systemd[1]: sshd@19-77.42.23.34:22-147.75.109.163:49648.service: Deactivated successfully. 
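Annotation (not part of the captured journal): each SSH connection above follows the same lifecycle (session-15 through session-21): systemd-logind announces "New session N", the session closes within a second or two, and the per-connection sshd@ unit is stopped. A sketch that pairs those logind messages from journal text on stdin and reports how long each session stayed open; the timestamp format is taken from the lines above and carries no year, which is fine for durations inside one capture:

    import re
    import sys
    from datetime import datetime

    # Pair systemd-logind "New session N" / "Removed session N" messages and
    # report how long each session was open.
    pattern = re.compile(
        r"(\w{3} +\d+ \d{2}:\d{2}:\d{2}\.\d+) systemd-logind\[\d+\]: "
        r"(New|Removed) session (\d+)"
    )
    opened = {}
    for when, event, session in pattern.findall(sys.stdin.read()):
        t = datetime.strptime(when, "%b %d %H:%M:%S.%f")
        if event == "New":
            opened[session] = t
        elif session in opened:
            print(f"session {session}: open {(t - opened.pop(session)).total_seconds():.1f}s")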
Dec 16 12:48:46.325000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.23.34:22-147.75.109.163:49648 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:48:46.328965 systemd[1]: session-21.scope: Deactivated successfully.
Dec 16 12:48:46.331390 systemd-logind[1607]: Removed session 21.
Dec 16 12:48:48.415549 kubelet[2807]: E1216 12:48:48.413460 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9"
Dec 16 12:48:49.413737 kubelet[2807]: E1216 12:48:49.413671 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15"
Dec 16 12:48:50.416186 kubelet[2807]: E1216 12:48:50.416145 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f"
Dec 16 12:48:51.415979 kubelet[2807]: E1216 12:48:51.415916 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29"
Dec 16 12:48:54.417486 kubelet[2807]: E1216 12:48:54.416880 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676"
Dec 16 12:48:59.416628 kubelet[2807]: E1216 12:48:59.416281 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187"
Dec 16 12:49:01.415461 kubelet[2807]: E1216 12:49:01.414796 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9"
Dec 16 12:49:02.416402 kubelet[2807]: E1216 12:49:02.416363 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15"
Dec 16 12:49:03.412962 kubelet[2807]: E1216 12:49:03.412873 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f"
Dec 16 12:49:03.414020 kubelet[2807]: E1216 12:49:03.413868 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29"
Dec 16 12:49:05.414177 kubelet[2807]: E1216 12:49:05.413790 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676"
Dec 16 12:49:12.413755 kubelet[2807]: E1216 12:49:12.413617 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187"
Dec 16 12:49:14.413180 kubelet[2807]: E1216 12:49:14.412932 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-7wdcc" podUID="1081c7e3-0645-4858-ac62-90f635bb19a9"
Dec 16 12:49:16.414088 kubelet[2807]: E1216 12:49:16.414011 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9f54fc4f9-v6lkx" podUID="50e78f30-18dc-4823-b78b-3500e19d4f6f"
Dec 16 12:49:16.414754 kubelet[2807]: E1216 12:49:16.414616 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6df777597b-rfdvx" podUID="ae41bde4-2571-4fb7-adb8-a8808483af15"
Dec 16 12:49:16.415315 kubelet[2807]: E1216 12:49:16.414960 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-h22qr" podUID="eff3d741-729e-4ed9-a6a3-d314f99d7c29"
Dec 16 12:49:18.232383 systemd[1]: cri-containerd-de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398.scope: Deactivated successfully.
Dec 16 12:49:18.235589 systemd[1]: cri-containerd-de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398.scope: Consumed 3.357s CPU time, 86.8M memory peak, 54.5M read from disk.
Dec 16 12:49:18.242850 kernel: kauditd_printk_skb: 1 callbacks suppressed
Dec 16 12:49:18.242935 kernel: audit: type=1334 audit(1765889358.237:853): prog-id=256 op=LOAD
Dec 16 12:49:18.237000 audit: BPF prog-id=256 op=LOAD
Dec 16 12:49:18.246738 kernel: audit: type=1334 audit(1765889358.241:854): prog-id=84 op=UNLOAD
Dec 16 12:49:18.241000 audit: BPF prog-id=84 op=UNLOAD
Dec 16 12:49:18.243000 audit: BPF prog-id=103 op=UNLOAD
Dec 16 12:49:18.253671 kernel: audit: type=1334 audit(1765889358.243:855): prog-id=103 op=UNLOAD
Dec 16 12:49:18.253754 kernel: audit: type=1334 audit(1765889358.243:856): prog-id=107 op=UNLOAD
Dec 16 12:49:18.243000 audit: BPF prog-id=107 op=UNLOAD
Dec 16 12:49:18.373842 containerd[1624]: time="2025-12-16T12:49:18.373779483Z" level=info msg="received container exit event container_id:\"de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398\" id:\"de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398\" pid:2635 exit_status:1 exited_at:{seconds:1765889358 nanos:291264975}"
Dec 16 12:49:18.471094 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398-rootfs.mount: Deactivated successfully.
Dec 16 12:49:18.513978 kubelet[2807]: E1216 12:49:18.513866 2807 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:58162->10.0.0.2:2379: read: connection timed out"
Dec 16 12:49:19.179390 kubelet[2807]: I1216 12:49:19.179313 2807 scope.go:117] "RemoveContainer" containerID="de0fa444d3c8b2924f42005b8d77b50749fcae456655d3ff726138da9c901398"
Dec 16 12:49:19.203022 containerd[1624]: time="2025-12-16T12:49:19.202978453Z" level=info msg="CreateContainer within sandbox \"f4468fd9d0dad3406d86c94157e59de0a077e81ba3c30ef22106a807e960e4de\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Dec 16 12:49:19.277805 containerd[1624]: time="2025-12-16T12:49:19.277736738Z" level=info msg="Container ca99fdb9dcc2b6c56bcf95771f76122f5c20b7fbfc211432951b0d37d791ad7d: CDI devices from CRI Config.CDIDevices: []"
Dec 16 12:49:19.283472 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3793734689.mount: Deactivated successfully.
Dec 16 12:49:19.299741 containerd[1624]: time="2025-12-16T12:49:19.299697818Z" level=info msg="CreateContainer within sandbox \"f4468fd9d0dad3406d86c94157e59de0a077e81ba3c30ef22106a807e960e4de\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"ca99fdb9dcc2b6c56bcf95771f76122f5c20b7fbfc211432951b0d37d791ad7d\""
Dec 16 12:49:19.300487 containerd[1624]: time="2025-12-16T12:49:19.300449305Z" level=info msg="StartContainer for \"ca99fdb9dcc2b6c56bcf95771f76122f5c20b7fbfc211432951b0d37d791ad7d\""
Dec 16 12:49:19.315521 containerd[1624]: time="2025-12-16T12:49:19.315470362Z" level=info msg="connecting to shim ca99fdb9dcc2b6c56bcf95771f76122f5c20b7fbfc211432951b0d37d791ad7d" address="unix:///run/containerd/s/3ed8244b314c48a7828785cebabb7e568327b088ed48a719a2908f2a152beec6" protocol=ttrpc version=3
Dec 16 12:49:19.345898 systemd[1]: Started cri-containerd-ca99fdb9dcc2b6c56bcf95771f76122f5c20b7fbfc211432951b0d37d791ad7d.scope - libcontainer container ca99fdb9dcc2b6c56bcf95771f76122f5c20b7fbfc211432951b0d37d791ad7d.
Dec 16 12:49:19.369000 audit: BPF prog-id=257 op=LOAD
Dec 16 12:49:19.375561 kernel: audit: type=1334 audit(1765889359.369:857): prog-id=257 op=LOAD
Dec 16 12:49:19.371000 audit: BPF prog-id=258 op=LOAD
Dec 16 12:49:19.390043 kernel: audit: type=1334 audit(1765889359.371:858): prog-id=258 op=LOAD
Dec 16 12:49:19.390153 kernel: audit: type=1300 audit(1765889359.371:858): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2501 pid=5349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:19.371000 audit[5349]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2501 pid=5349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:19.371000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361393966646239646363326236633536626366393537373166373631
Dec 16 12:49:19.403135 kernel: audit: type=1327 audit(1765889359.371:858): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361393966646239646363326236633536626366393537373166373631
Dec 16 12:49:19.403213 kernel: audit: type=1334 audit(1765889359.371:859): prog-id=258 op=UNLOAD
Dec 16 12:49:19.371000 audit: BPF prog-id=258 op=UNLOAD
Dec 16 12:49:19.411417 kernel: audit: type=1300 audit(1765889359.371:859): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2501 pid=5349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:19.371000 audit[5349]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2501 pid=5349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:19.371000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361393966646239646363326236633536626366393537373166373631
Dec 16 12:49:19.372000 audit: BPF prog-id=259 op=LOAD
Dec 16 12:49:19.372000 audit[5349]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=2501 pid=5349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:19.372000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361393966646239646363326236633536626366393537373166373631
Dec 16 12:49:19.372000 audit: BPF prog-id=260 op=LOAD
Dec 16 12:49:19.372000 audit[5349]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=2501 pid=5349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:19.372000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361393966646239646363326236633536626366393537373166373631
Dec 16 12:49:19.372000 audit: BPF prog-id=260 op=UNLOAD
Dec 16 12:49:19.372000 audit[5349]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2501 pid=5349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:19.372000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361393966646239646363326236633536626366393537373166373631
Dec 16 12:49:19.372000 audit: BPF prog-id=259 op=UNLOAD
Dec 16 12:49:19.372000 audit[5349]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2501 pid=5349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:19.372000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361393966646239646363326236633536626366393537373166373631
Dec 16 12:49:19.372000 audit: BPF prog-id=261 op=LOAD
Dec 16 12:49:19.372000 audit[5349]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=2501 pid=5349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:19.372000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361393966646239646363326236633536626366393537373166373631
Dec 16 12:49:19.435558 containerd[1624]: time="2025-12-16T12:49:19.434511399Z" level=info msg="StartContainer for \"ca99fdb9dcc2b6c56bcf95771f76122f5c20b7fbfc211432951b0d37d791ad7d\" returns successfully"
Dec 16 12:49:19.879940 systemd[1]: cri-containerd-ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e.scope: Deactivated successfully.
Dec 16 12:49:19.880227 systemd[1]: cri-containerd-ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e.scope: Consumed 19.502s CPU time, 122.3M memory peak, 46.1M read from disk.
Dec 16 12:49:19.881737 containerd[1624]: time="2025-12-16T12:49:19.881683638Z" level=info msg="received container exit event container_id:\"ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e\" id:\"ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e\" pid:3160 exit_status:1 exited_at:{seconds:1765889359 nanos:881071754}"
Dec 16 12:49:19.884000 audit: BPF prog-id=146 op=UNLOAD
Dec 16 12:49:19.884000 audit: BPF prog-id=150 op=UNLOAD
Dec 16 12:49:19.916812 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e-rootfs.mount: Deactivated successfully.
Dec 16 12:49:20.181053 kubelet[2807]: I1216 12:49:20.180525 2807 scope.go:117] "RemoveContainer" containerID="ab21ada075eba70c517f2402a4ff6315fe1cc5c2efcd5a1d8059349027b8320e"
Dec 16 12:49:20.188542 containerd[1624]: time="2025-12-16T12:49:20.187691392Z" level=info msg="CreateContainer within sandbox \"0b1dcdf3a200f24cb7aee7607d6797553e294aea2ef5c50604e6aa85fea57c74\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Dec 16 12:49:20.200011 containerd[1624]: time="2025-12-16T12:49:20.199980088Z" level=info msg="Container e25bd341ccd1aab4859c208c61e8b542524a8a4f060010256bf620c54a2e86ed: CDI devices from CRI Config.CDIDevices: []"
Dec 16 12:49:20.208094 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1503622207.mount: Deactivated successfully.
Dec 16 12:49:20.211205 containerd[1624]: time="2025-12-16T12:49:20.211177989Z" level=info msg="CreateContainer within sandbox \"0b1dcdf3a200f24cb7aee7607d6797553e294aea2ef5c50604e6aa85fea57c74\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"e25bd341ccd1aab4859c208c61e8b542524a8a4f060010256bf620c54a2e86ed\""
Dec 16 12:49:20.211862 containerd[1624]: time="2025-12-16T12:49:20.211838644Z" level=info msg="StartContainer for \"e25bd341ccd1aab4859c208c61e8b542524a8a4f060010256bf620c54a2e86ed\""
Dec 16 12:49:20.212676 containerd[1624]: time="2025-12-16T12:49:20.212652147Z" level=info msg="connecting to shim e25bd341ccd1aab4859c208c61e8b542524a8a4f060010256bf620c54a2e86ed" address="unix:///run/containerd/s/db2811244de4879070c05f3deff7168fd40f9d31090d2e4314c4a8480daa10da" protocol=ttrpc version=3
Dec 16 12:49:20.239700 systemd[1]: Started cri-containerd-e25bd341ccd1aab4859c208c61e8b542524a8a4f060010256bf620c54a2e86ed.scope - libcontainer container e25bd341ccd1aab4859c208c61e8b542524a8a4f060010256bf620c54a2e86ed.
Dec 16 12:49:20.254000 audit: BPF prog-id=262 op=LOAD
Dec 16 12:49:20.255000 audit: BPF prog-id=263 op=LOAD
Dec 16 12:49:20.255000 audit[5393]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2938 pid=5393 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:20.255000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532356264333431636364316161623438353963323038633631653862
Dec 16 12:49:20.255000 audit: BPF prog-id=263 op=UNLOAD
Dec 16 12:49:20.255000 audit[5393]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2938 pid=5393 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:20.255000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532356264333431636364316161623438353963323038633631653862
Dec 16 12:49:20.255000 audit: BPF prog-id=264 op=LOAD
Dec 16 12:49:20.255000 audit[5393]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2938 pid=5393 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:20.255000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532356264333431636364316161623438353963323038633631653862
Dec 16 12:49:20.255000 audit: BPF prog-id=265 op=LOAD
Dec 16 12:49:20.255000 audit[5393]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2938 pid=5393 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:20.255000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532356264333431636364316161623438353963323038633631653862
Dec 16 12:49:20.255000 audit: BPF prog-id=265 op=UNLOAD
Dec 16 12:49:20.255000 audit[5393]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2938 pid=5393 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:20.255000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532356264333431636364316161623438353963323038633631653862
Dec 16 12:49:20.256000 audit: BPF prog-id=264 op=UNLOAD
Dec 16 12:49:20.256000 audit[5393]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2938 pid=5393 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:20.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532356264333431636364316161623438353963323038633631653862
Dec 16 12:49:20.256000 audit: BPF prog-id=266 op=LOAD
Dec 16 12:49:20.256000 audit[5393]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2938 pid=5393 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:20.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532356264333431636364316161623438353963323038633631653862
Dec 16 12:49:20.283484 containerd[1624]: time="2025-12-16T12:49:20.283442630Z" level=info msg="StartContainer for \"e25bd341ccd1aab4859c208c61e8b542524a8a4f060010256bf620c54a2e86ed\" returns successfully"
Dec 16 12:49:20.429578 kubelet[2807]: E1216 12:49:20.429502 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sdw6q" podUID="db9e60f0-9cc3-4000-b32c-9ba313d4b676"
Dec 16 12:49:23.256870 systemd[1]: cri-containerd-b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2.scope: Deactivated successfully.
Dec 16 12:49:23.257916 systemd[1]: cri-containerd-b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2.scope: Consumed 2.128s CPU time, 41.2M memory peak, 32.2M read from disk.
Dec 16 12:49:23.263676 kernel: kauditd_printk_skb: 40 callbacks suppressed
Dec 16 12:49:23.263767 kernel: audit: type=1334 audit(1765889363.258:875): prog-id=267 op=LOAD
Dec 16 12:49:23.258000 audit: BPF prog-id=267 op=LOAD
Dec 16 12:49:23.263883 containerd[1624]: time="2025-12-16T12:49:23.260499017Z" level=info msg="received container exit event container_id:\"b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2\" id:\"b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2\" pid:2633 exit_status:1 exited_at:{seconds:1765889363 nanos:260094394}"
Dec 16 12:49:23.263000 audit: BPF prog-id=93 op=UNLOAD
Dec 16 12:49:23.269902 kernel: audit: type=1334 audit(1765889363.263:876): prog-id=93 op=UNLOAD
Dec 16 12:49:23.269983 kernel: audit: type=1334 audit(1765889363.264:877): prog-id=98 op=UNLOAD
Dec 16 12:49:23.264000 audit: BPF prog-id=98 op=UNLOAD
Dec 16 12:49:23.271569 kernel: audit: type=1334 audit(1765889363.264:878): prog-id=102 op=UNLOAD
Dec 16 12:49:23.264000 audit: BPF prog-id=102 op=UNLOAD
Dec 16 12:49:23.288434 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2-rootfs.mount: Deactivated successfully.
Dec 16 12:49:24.211809 kubelet[2807]: I1216 12:49:24.211761 2807 scope.go:117] "RemoveContainer" containerID="b2c5e7137d42130a1a7df20ea9a621ae4b05626598f00bce0414e28150fda5f2"
Dec 16 12:49:24.214001 containerd[1624]: time="2025-12-16T12:49:24.213945344Z" level=info msg="CreateContainer within sandbox \"c136070a436d7c9a0b4df15a4476da693a1c9a34fe0ad76af60a348d79c2118e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Dec 16 12:49:24.226036 containerd[1624]: time="2025-12-16T12:49:24.225195262Z" level=info msg="Container 2a62cdb7d92c3de8cd6bc80738b41990bdbc13e110ed5e33c504f7a8f61875fe: CDI devices from CRI Config.CDIDevices: []"
Dec 16 12:49:24.231274 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2767858562.mount: Deactivated successfully.
Dec 16 12:49:24.234822 containerd[1624]: time="2025-12-16T12:49:24.234766515Z" level=info msg="CreateContainer within sandbox \"c136070a436d7c9a0b4df15a4476da693a1c9a34fe0ad76af60a348d79c2118e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"2a62cdb7d92c3de8cd6bc80738b41990bdbc13e110ed5e33c504f7a8f61875fe\""
Dec 16 12:49:24.235205 containerd[1624]: time="2025-12-16T12:49:24.235179013Z" level=info msg="StartContainer for \"2a62cdb7d92c3de8cd6bc80738b41990bdbc13e110ed5e33c504f7a8f61875fe\""
Dec 16 12:49:24.240624 containerd[1624]: time="2025-12-16T12:49:24.240521715Z" level=info msg="connecting to shim 2a62cdb7d92c3de8cd6bc80738b41990bdbc13e110ed5e33c504f7a8f61875fe" address="unix:///run/containerd/s/182570ef8b40130a787c5e0caea0d640201bd5733f6bcc2322f0eec1b46193c6" protocol=ttrpc version=3
Dec 16 12:49:24.268773 systemd[1]: Started cri-containerd-2a62cdb7d92c3de8cd6bc80738b41990bdbc13e110ed5e33c504f7a8f61875fe.scope - libcontainer container 2a62cdb7d92c3de8cd6bc80738b41990bdbc13e110ed5e33c504f7a8f61875fe.
Dec 16 12:49:24.286631 kernel: audit: type=1334 audit(1765889364.283:879): prog-id=268 op=LOAD
Dec 16 12:49:24.283000 audit: BPF prog-id=268 op=LOAD
Dec 16 12:49:24.284000 audit: BPF prog-id=269 op=LOAD
Dec 16 12:49:24.290581 kernel: audit: type=1334 audit(1765889364.284:880): prog-id=269 op=LOAD
Dec 16 12:49:24.284000 audit[5436]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2489 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:24.306047 kernel: audit: type=1300 audit(1765889364.284:880): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2489 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:24.306096 kernel: audit: type=1327 audit(1765889364.284:880): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261363263646237643932633364653863643662633830373338623431
Dec 16 12:49:24.284000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261363263646237643932633364653863643662633830373338623431
Dec 16 12:49:24.308632 kernel: audit: type=1334 audit(1765889364.284:881): prog-id=269 op=UNLOAD
Dec 16 12:49:24.284000 audit: BPF prog-id=269 op=UNLOAD
Dec 16 12:49:24.284000 audit[5436]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:24.284000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261363263646237643932633364653863643662633830373338623431
Dec 16 12:49:24.284000 audit: BPF prog-id=270 op=LOAD
Dec 16 12:49:24.284000 audit[5436]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2489 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:24.284000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261363263646237643932633364653863643662633830373338623431
Dec 16 12:49:24.285000 audit: BPF prog-id=271 op=LOAD
Dec 16 12:49:24.285000 audit[5436]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2489 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:24.285000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261363263646237643932633364653863643662633830373338623431
Dec 16 12:49:24.285000 audit: BPF prog-id=271 op=UNLOAD
Dec 16 12:49:24.285000 audit[5436]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:24.285000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261363263646237643932633364653863643662633830373338623431
Dec 16 12:49:24.285000 audit: BPF prog-id=270 op=UNLOAD
Dec 16 12:49:24.285000 audit[5436]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:24.285000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261363263646237643932633364653863643662633830373338623431
Dec 16 12:49:24.285000 audit: BPF prog-id=272 op=LOAD
Dec 16 12:49:24.285000 audit[5436]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2489 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:24.285000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3261363263646237643932633364653863643662633830373338623431
Dec 16 12:49:24.317556 kernel: audit: type=1300 audit(1765889364.284:881): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2489 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:49:24.328058 containerd[1624]: time="2025-12-16T12:49:24.327469766Z" level=info msg="StartContainer for \"2a62cdb7d92c3de8cd6bc80738b41990bdbc13e110ed5e33c504f7a8f61875fe\" returns successfully"
Dec 16 12:49:24.414021 kubelet[2807]: E1216 12:49:24.413922 2807 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78c5bf4bd4-c4c2s" podUID="373b77f0-3f56-4579-96a0-a033d280b187"